Dec 16 12:16:17.373947 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 16 12:16:17.373971 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Dec 16 00:05:24 -00 2025
Dec 16 12:16:17.373981 kernel: KASLR enabled
Dec 16 12:16:17.373987 kernel: efi: EFI v2.7 by EDK II
Dec 16 12:16:17.373993 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Dec 16 12:16:17.373999 kernel: random: crng init done
Dec 16 12:16:17.374006 kernel: secureboot: Secure boot disabled
Dec 16 12:16:17.374012 kernel: ACPI: Early table checksum verification disabled
Dec 16 12:16:17.374018 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Dec 16 12:16:17.374026 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Dec 16 12:16:17.374032 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374038 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374044 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374064 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374077 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374083 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374090 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374096 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374102 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374109 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:16:17.374115 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 16 12:16:17.374121 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 16 12:16:17.374128 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 16 12:16:17.374135 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Dec 16 12:16:17.374142 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Dec 16 12:16:17.374148 kernel: Zone ranges:
Dec 16 12:16:17.374154 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 16 12:16:17.374160 kernel: DMA32 empty
Dec 16 12:16:17.374167 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 16 12:16:17.374173 kernel: Device empty
Dec 16 12:16:17.374179 kernel: Movable zone start for each node
Dec 16 12:16:17.374186 kernel: Early memory node ranges
Dec 16 12:16:17.374192 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Dec 16 12:16:17.374199 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Dec 16 12:16:17.374205 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Dec 16 12:16:17.374213 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Dec 16 12:16:17.374219 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Dec 16 12:16:17.374225 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Dec 16 12:16:17.374232 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 16 12:16:17.374238 kernel: psci: probing for conduit method from ACPI.
Dec 16 12:16:17.374247 kernel: psci: PSCIv1.3 detected in firmware.
Dec 16 12:16:17.374256 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 16 12:16:17.374263 kernel: psci: Trusted OS migration not required
Dec 16 12:16:17.374269 kernel: psci: SMC Calling Convention v1.1
Dec 16 12:16:17.374276 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 16 12:16:17.374283 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 16 12:16:17.374290 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 16 12:16:17.374297 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Dec 16 12:16:17.374303 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Dec 16 12:16:17.374311 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 16 12:16:17.374318 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 16 12:16:17.374325 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Dec 16 12:16:17.374332 kernel: Detected PIPT I-cache on CPU0
Dec 16 12:16:17.374339 kernel: CPU features: detected: GIC system register CPU interface
Dec 16 12:16:17.374346 kernel: CPU features: detected: Spectre-v4
Dec 16 12:16:17.374352 kernel: CPU features: detected: Spectre-BHB
Dec 16 12:16:17.374359 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 16 12:16:17.374366 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 16 12:16:17.374373 kernel: CPU features: detected: ARM erratum 1418040
Dec 16 12:16:17.374380 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 16 12:16:17.374387 kernel: alternatives: applying boot alternatives
Dec 16 12:16:17.374395 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749
Dec 16 12:16:17.374403 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 16 12:16:17.374410 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 16 12:16:17.374416 kernel: Fallback order for Node 0: 0
Dec 16 12:16:17.374423 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Dec 16 12:16:17.374430 kernel: Policy zone: Normal
Dec 16 12:16:17.374437 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 12:16:17.374443 kernel: software IO TLB: area num 4.
Dec 16 12:16:17.374450 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 16 12:16:17.374458 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 16 12:16:17.374465 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 12:16:17.374473 kernel: rcu: RCU event tracing is enabled.
Dec 16 12:16:17.374480 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Dec 16 12:16:17.374487 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 12:16:17.374494 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 12:16:17.374500 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 12:16:17.374507 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 16 12:16:17.374514 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:16:17.374521 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:16:17.374528 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 16 12:16:17.374536 kernel: GICv3: 256 SPIs implemented
Dec 16 12:16:17.374542 kernel: GICv3: 0 Extended SPIs implemented
Dec 16 12:16:17.374549 kernel: Root IRQ handler: gic_handle_irq
Dec 16 12:16:17.374556 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 16 12:16:17.374563 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 16 12:16:17.374569 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 16 12:16:17.374576 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 16 12:16:17.374583 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Dec 16 12:16:17.374590 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Dec 16 12:16:17.374597 kernel: GICv3: using LPI property table @0x0000000100130000
Dec 16 12:16:17.374604 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Dec 16 12:16:17.374611 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 12:16:17.374619 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:16:17.374626 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 16 12:16:17.374632 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 16 12:16:17.374639 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 16 12:16:17.374646 kernel: arm-pv: using stolen time PV
Dec 16 12:16:17.374654 kernel: Console: colour dummy device 80x25
Dec 16 12:16:17.374661 kernel: ACPI: Core revision 20240827
Dec 16 12:16:17.374669 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 16 12:16:17.374677 kernel: pid_max: default: 32768 minimum: 301
Dec 16 12:16:17.374684 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 12:16:17.374691 kernel: landlock: Up and running.
Dec 16 12:16:17.374698 kernel: SELinux: Initializing.
Dec 16 12:16:17.374705 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:16:17.374713 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:16:17.374720 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 12:16:17.374727 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 12:16:17.374736 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 12:16:17.374743 kernel: Remapping and enabling EFI services.
Dec 16 12:16:17.374750 kernel: smp: Bringing up secondary CPUs ...
Dec 16 12:16:17.374757 kernel: Detected PIPT I-cache on CPU1
Dec 16 12:16:17.374764 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 16 12:16:17.374772 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Dec 16 12:16:17.374779 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:16:17.374787 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 16 12:16:17.374795 kernel: Detected PIPT I-cache on CPU2
Dec 16 12:16:17.374816 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Dec 16 12:16:17.374828 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Dec 16 12:16:17.374836 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:16:17.374843 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Dec 16 12:16:17.374850 kernel: Detected PIPT I-cache on CPU3
Dec 16 12:16:17.374858 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Dec 16 12:16:17.374866 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Dec 16 12:16:17.374874 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:16:17.374881 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Dec 16 12:16:17.374888 kernel: smp: Brought up 1 node, 4 CPUs
Dec 16 12:16:17.374896 kernel: SMP: Total of 4 processors activated.
Dec 16 12:16:17.374903 kernel: CPU: All CPU(s) started at EL1
Dec 16 12:16:17.374912 kernel: CPU features: detected: 32-bit EL0 Support
Dec 16 12:16:17.374920 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 16 12:16:17.374927 kernel: CPU features: detected: Common not Private translations
Dec 16 12:16:17.374935 kernel: CPU features: detected: CRC32 instructions
Dec 16 12:16:17.374942 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 16 12:16:17.374950 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 16 12:16:17.374957 kernel: CPU features: detected: LSE atomic instructions
Dec 16 12:16:17.374966 kernel: CPU features: detected: Privileged Access Never
Dec 16 12:16:17.374974 kernel: CPU features: detected: RAS Extension Support
Dec 16 12:16:17.374981 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 16 12:16:17.374989 kernel: alternatives: applying system-wide alternatives
Dec 16 12:16:17.374996 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Dec 16 12:16:17.375004 kernel: Memory: 16324432K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 12480K init, 1038K bss, 430000K reserved, 16384K cma-reserved)
Dec 16 12:16:17.375012 kernel: devtmpfs: initialized
Dec 16 12:16:17.375021 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 12:16:17.375028 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Dec 16 12:16:17.375036 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 16 12:16:17.375043 kernel: 0 pages in range for non-PLT usage
Dec 16 12:16:17.375051 kernel: 515168 pages in range for PLT usage
Dec 16 12:16:17.375058 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 12:16:17.375066 kernel: SMBIOS 3.0.0 present.
Dec 16 12:16:17.375074 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Dec 16 12:16:17.375082 kernel: DMI: Memory slots populated: 1/1
Dec 16 12:16:17.375090 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 12:16:17.375098 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 16 12:16:17.375105 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 16 12:16:17.375113 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 16 12:16:17.375120 kernel: audit: initializing netlink subsys (disabled)
Dec 16 12:16:17.375128 kernel: audit: type=2000 audit(0.038:1): state=initialized audit_enabled=0 res=1
Dec 16 12:16:17.375137 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 12:16:17.375144 kernel: cpuidle: using governor menu
Dec 16 12:16:17.375152 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 16 12:16:17.375159 kernel: ASID allocator initialised with 32768 entries
Dec 16 12:16:17.375167 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 12:16:17.375174 kernel: Serial: AMBA PL011 UART driver
Dec 16 12:16:17.375182 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 12:16:17.375191 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 12:16:17.375199 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 16 12:16:17.375206 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 16 12:16:17.375214 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 12:16:17.375221 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 12:16:17.375229 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 16 12:16:17.375236 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 16 12:16:17.375245 kernel: ACPI: Added _OSI(Module Device)
Dec 16 12:16:17.375253 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 12:16:17.375261 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 12:16:17.375268 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 12:16:17.375275 kernel: ACPI: Interpreter enabled
Dec 16 12:16:17.375283 kernel: ACPI: Using GIC for interrupt routing
Dec 16 12:16:17.375290 kernel: ACPI: MCFG table detected, 1 entries
Dec 16 12:16:17.375298 kernel: ACPI: CPU0 has been hot-added
Dec 16 12:16:17.375306 kernel: ACPI: CPU1 has been hot-added
Dec 16 12:16:17.375314 kernel: ACPI: CPU2 has been hot-added
Dec 16 12:16:17.375321 kernel: ACPI: CPU3 has been hot-added
Dec 16 12:16:17.375329 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 16 12:16:17.375337 kernel: printk: legacy console [ttyAMA0] enabled
Dec 16 12:16:17.375345 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 12:16:17.375523 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 12:16:17.375631 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 16 12:16:17.375724 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 16 12:16:17.375849 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 16 12:16:17.375939 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 16 12:16:17.375949 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 16 12:16:17.375957 kernel: PCI host bridge to bus 0000:00
Dec 16 12:16:17.376051 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 16 12:16:17.376126 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 16 12:16:17.376199 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 16 12:16:17.376272 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 12:16:17.376376 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 16 12:16:17.376470 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.376557 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Dec 16 12:16:17.376638 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 16 12:16:17.376717 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Dec 16 12:16:17.376796 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 16 12:16:17.376901 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.376986 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Dec 16 12:16:17.377066 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 16 12:16:17.377147 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Dec 16 12:16:17.377235 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.377317 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Dec 16 12:16:17.377411 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 16 12:16:17.377493 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Dec 16 12:16:17.377592 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 16 12:16:17.377686 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.377769 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Dec 16 12:16:17.377876 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 16 12:16:17.377962 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 16 12:16:17.378073 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.378170 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Dec 16 12:16:17.378257 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 16 12:16:17.378340 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Dec 16 12:16:17.378419 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 16 12:16:17.378510 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.378590 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Dec 16 12:16:17.378669 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 16 12:16:17.378747 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Dec 16 12:16:17.378848 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 16 12:16:17.378940 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.379024 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Dec 16 12:16:17.379103 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 16 12:16:17.379188 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.379267 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Dec 16 12:16:17.379347 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 16 12:16:17.379464 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.379551 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Dec 16 12:16:17.379629 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 16 12:16:17.379717 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.379796 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Dec 16 12:16:17.379890 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 16 12:16:17.379982 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.380071 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Dec 16 12:16:17.380151 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 16 12:16:17.380238 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.380318 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Dec 16 12:16:17.380401 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 16 12:16:17.380488 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.380571 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Dec 16 12:16:17.380651 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 16 12:16:17.380738 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.380834 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Dec 16 12:16:17.380937 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 16 12:16:17.381028 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.381111 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Dec 16 12:16:17.381190 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 16 12:16:17.381279 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.381373 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Dec 16 12:16:17.381457 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 16 12:16:17.381547 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.381647 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Dec 16 12:16:17.381728 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 16 12:16:17.381832 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.381921 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Dec 16 12:16:17.382012 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 16 12:16:17.382117 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Dec 16 12:16:17.382201 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Dec 16 12:16:17.382287 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.382366 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Dec 16 12:16:17.382449 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 16 12:16:17.382528 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Dec 16 12:16:17.382606 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Dec 16 12:16:17.382698 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.382778 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Dec 16 12:16:17.382877 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 16 12:16:17.382962 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Dec 16 12:16:17.383041 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Dec 16 12:16:17.383128 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.383212 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Dec 16 12:16:17.383291 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 16 12:16:17.383370 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Dec 16 12:16:17.383452 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Dec 16 12:16:17.383538 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.383617 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Dec 16 12:16:17.383697 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 16 12:16:17.383776 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Dec 16 12:16:17.383871 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Dec 16 12:16:17.383966 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.384048 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Dec 16 12:16:17.384128 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 16 12:16:17.384207 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Dec 16 12:16:17.384287 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Dec 16 12:16:17.384376 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.384459 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Dec 16 12:16:17.384540 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 16 12:16:17.384619 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Dec 16 12:16:17.384699 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Dec 16 12:16:17.384785 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.384885 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Dec 16 12:16:17.384972 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 16 12:16:17.385055 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Dec 16 12:16:17.385137 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Dec 16 12:16:17.385230 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.385310 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Dec 16 12:16:17.385391 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 16 12:16:17.385468 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Dec 16 12:16:17.385549 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 16 12:16:17.385634 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.385714 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Dec 16 12:16:17.385794 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 16 12:16:17.385891 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Dec 16 12:16:17.385972 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 16 12:16:17.386075 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.386165 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Dec 16 12:16:17.386246 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 16 12:16:17.386341 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Dec 16 12:16:17.386428 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 16 12:16:17.386521 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.386604 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Dec 16 12:16:17.386685 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 16 12:16:17.386768 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Dec 16 12:16:17.386909 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Dec 16 12:16:17.387004 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.387088 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Dec 16 12:16:17.387168 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 16 12:16:17.387246 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Dec 16 12:16:17.387327 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Dec 16 12:16:17.387413 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.387493 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Dec 16 12:16:17.387571 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 16 12:16:17.387649 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Dec 16 12:16:17.387727 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Dec 16 12:16:17.387823 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.387906 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Dec 16 12:16:17.387994 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 16 12:16:17.388078 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Dec 16 12:16:17.388161 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Dec 16 12:16:17.388247 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:16:17.388328 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Dec 16 12:16:17.388406 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 16 12:16:17.388485 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Dec 16 12:16:17.388563 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Dec 16 12:16:17.388671 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 12:16:17.388755 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Dec 16 12:16:17.388847 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 16 12:16:17.388942 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 16 12:16:17.389041 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 12:16:17.389123 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Dec 16 12:16:17.389212 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 16 12:16:17.389297 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 16 12:16:17.389378 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 16 12:16:17.389465 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:16:17.389547 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 16 12:16:17.389638 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:16:17.389728 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 16 12:16:17.389920 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 16 12:16:17.390047 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 16 12:16:17.390161 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 16 12:16:17.390244 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 16 12:16:17.390330 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 16 12:16:17.390419 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:16:17.390498 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:16:17.390582 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 16 12:16:17.390662 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 16 12:16:17.390762 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 16 12:16:17.390883 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 16 12:16:17.390973 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:16:17.391054 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:16:17.391143 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 16 12:16:17.391230 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 16 12:16:17.391313 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 16 12:16:17.391396 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 16 12:16:17.391477 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:16:17.391555 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:16:17.391638 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 16 12:16:17.391721 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:16:17.391800 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:16:17.391904 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 12:16:17.391985 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:16:17.392065 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:16:17.392149 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 12:16:17.392231 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:16:17.392310 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:16:17.392393 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 12:16:17.392473 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:16:17.392551 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:16:17.392637 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 16 12:16:17.392717 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:16:17.392796 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:16:17.392890 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 16 12:16:17.392972 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:16:17.393051 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:16:17.393138 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 16 12:16:17.393218 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:16:17.393296 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:16:17.393380 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 16 12:16:17.393473 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 16 12:16:17.393555 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 16
12:16:17.393641 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Dec 16 12:16:17.393720 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Dec 16 12:16:17.393799 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Dec 16 12:16:17.393893 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Dec 16 12:16:17.393974 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Dec 16 12:16:17.394069 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Dec 16 12:16:17.394159 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Dec 16 12:16:17.394250 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Dec 16 12:16:17.394330 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Dec 16 12:16:17.394417 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Dec 16 12:16:17.394498 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Dec 16 12:16:17.394579 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Dec 16 12:16:17.394663 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Dec 16 12:16:17.394744 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Dec 16 12:16:17.394853 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Dec 16 12:16:17.394944 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Dec 16 12:16:17.395029 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Dec 16 12:16:17.395120 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Dec 16 12:16:17.395220 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Dec 16 12:16:17.395304 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Dec 16 12:16:17.395383 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Dec 16 12:16:17.395466 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Dec 16 12:16:17.395558 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Dec 16 12:16:17.395640 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Dec 16 12:16:17.395724 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Dec 16 12:16:17.395805 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Dec 16 12:16:17.395930 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Dec 16 12:16:17.396021 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Dec 16 12:16:17.396102 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Dec 16 12:16:17.396190 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Dec 16 12:16:17.396279 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Dec 16 12:16:17.396359 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Dec 16 12:16:17.396439 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Dec 16 12:16:17.396525 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Dec 16 12:16:17.396606 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Dec 16 12:16:17.396685 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Dec 16 12:16:17.396770 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Dec 16 12:16:17.396865 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Dec 16 12:16:17.396947 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Dec 16 12:16:17.397034 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Dec 16 12:16:17.397116 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Dec 16 12:16:17.397195 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Dec 16 12:16:17.397278 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Dec 16 12:16:17.397358 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Dec 16 12:16:17.397439 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Dec 16 12:16:17.397521 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Dec 16 12:16:17.397601 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Dec 16 12:16:17.397680 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Dec 16 12:16:17.397762 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Dec 16 12:16:17.397876 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Dec 16 12:16:17.397963 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Dec 16 12:16:17.398047 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Dec 16 12:16:17.398152 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Dec 16 12:16:17.398242 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Dec 16 12:16:17.398328 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Dec 16 12:16:17.398419 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Dec 16 12:16:17.398513 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Dec 16 12:16:17.398602 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Dec 16 12:16:17.398682 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Dec 16 12:16:17.398762 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Dec 16 12:16:17.398873 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 16 12:16:17.398975 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Dec 16 12:16:17.399075 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 16 12:16:17.399160 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 16 12:16:17.399251 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 16 12:16:17.399332 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 16 12:16:17.399415 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 16 12:16:17.399498 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 16 12:16:17.399593 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 16 12:16:17.399675 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 16 12:16:17.399757 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 16 12:16:17.399851 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 16 12:16:17.399935 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 16 12:16:17.400015 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 16 12:16:17.400102 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 16 12:16:17.400181 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 16 12:16:17.400276 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 16 12:16:17.400373 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 16 12:16:17.400459 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Dec 16 12:16:17.400543 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Dec 16 12:16:17.400628 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Dec 16 12:16:17.400708 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Dec 16 12:16:17.400790 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Dec 16 12:16:17.400902 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Dec 16 12:16:17.400988 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Dec 16 12:16:17.401070 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Dec 16 12:16:17.401153 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Dec 16 12:16:17.401245 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Dec 16 12:16:17.401330 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Dec 16 12:16:17.401410 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Dec 16 12:16:17.401492 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Dec 16 12:16:17.401576 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Dec 16 12:16:17.401660 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Dec 16 12:16:17.401742 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Dec 16 12:16:17.401843 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Dec 16 12:16:17.401927 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Dec 16 12:16:17.402013 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Dec 16 12:16:17.402113 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Dec 16 12:16:17.402200 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Dec 16 12:16:17.402286 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Dec 16 12:16:17.402370 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Dec 16 12:16:17.402451 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Dec 16 12:16:17.402534 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Dec 16 12:16:17.402614 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Dec 16 12:16:17.402696 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Dec 16 12:16:17.402776 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Dec 16 12:16:17.402876 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Dec 16 12:16:17.402964 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Dec 16 12:16:17.403049 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Dec 16 12:16:17.403132 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Dec 16 12:16:17.403247 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Dec 16 12:16:17.403329 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Dec 16 12:16:17.403420 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Dec 16 12:16:17.403500 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Dec 16 12:16:17.403581 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Dec 16 12:16:17.403660 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Dec 16 12:16:17.403741 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Dec 16 12:16:17.403838 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Dec 16 12:16:17.403930 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Dec 16 12:16:17.404029 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Dec 16 12:16:17.404118 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Dec 16 12:16:17.404197 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Dec 16 12:16:17.404286 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Dec 16 12:16:17.404367 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Dec 16 12:16:17.404448 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Dec 16 12:16:17.404533 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Dec 16 12:16:17.404616 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Dec 16 12:16:17.404697 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:16:17.404779 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Dec 16 12:16:17.404879 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:16:17.404963 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Dec 16 12:16:17.405048 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:16:17.405130 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Dec 16 12:16:17.405210 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:16:17.405292 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Dec 16 12:16:17.405374 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:16:17.405457 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Dec 16 12:16:17.405541 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:16:17.405623 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Dec 16 12:16:17.405706 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:16:17.405794 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Dec 16 12:16:17.405900 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:16:17.405998 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Dec 16 12:16:17.406125 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:16:17.406225 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Dec 16 12:16:17.406313 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Dec 16 12:16:17.406398 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Dec 16 12:16:17.406481 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Dec 16 12:16:17.406571 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Dec 16 12:16:17.406652 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Dec 16 12:16:17.406738 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Dec 16 12:16:17.406839 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Dec 16 12:16:17.406928 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Dec 16 12:16:17.407009 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Dec 16 12:16:17.407093 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Dec 16 12:16:17.407187 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Dec 16 12:16:17.407273 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Dec 16 12:16:17.407358 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 
12:16:17.407439 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.407522 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Dec 16 12:16:17.407604 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.407686 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.407771 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Dec 16 12:16:17.407898 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.407991 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.408080 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Dec 16 12:16:17.408165 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.408252 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.408338 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Dec 16 12:16:17.408439 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.408520 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.408605 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Dec 16 12:16:17.408686 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.408768 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.408871 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Dec 16 12:16:17.408960 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.409040 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.409123 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Dec 16 12:16:17.409204 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.409292 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.409382 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Dec 16 12:16:17.409466 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.409545 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.409627 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Dec 16 12:16:17.409707 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.409786 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.409912 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Dec 16 12:16:17.409995 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.410094 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.410183 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Dec 16 12:16:17.410263 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.410343 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.410424 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Dec 16 12:16:17.410506 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.410585 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.410666 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Dec 16 12:16:17.410745 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.410842 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.410930 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Dec 16 12:16:17.411015 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.411095 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.411179 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Dec 16 12:16:17.411262 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.411342 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.411424 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Dec 16 12:16:17.411503 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.411584 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.411666 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Dec 16 12:16:17.411745 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.411837 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.411922 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:16:17.412002 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:16:17.412084 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:16:17.412164 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:16:17.412243 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:16:17.412324 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:16:17.412404 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:16:17.412484 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:16:17.412563 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:16:17.412646 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Dec 16 12:16:17.412726 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Dec 16 12:16:17.412814 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Dec 16 12:16:17.412904 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Dec 16 12:16:17.412989 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Dec 16 12:16:17.413071 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Dec 16 12:16:17.413152 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.413233 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.413314 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.413395 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.413475 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.413555 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.413635 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.413714 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.413798 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.413895 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.413978 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.414075 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.414197 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.414285 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.414369 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Dec 16 12:16:17.414456 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.414539 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.414618 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.414699 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.414782 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.414901 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.414993 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.415076 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.415159 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.415253 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.415335 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.415420 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.415509 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.415606 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.415691 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.415773 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.415896 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.415987 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.416070 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.416153 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Dec 16 12:16:17.416233 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:16:17.416320 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 16 12:16:17.416404 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 16 12:16:17.416487 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 16 12:16:17.416567 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Dec 16 12:16:17.416648 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:16:17.416727 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:16:17.416841 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 16 12:16:17.416928 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Dec 16 12:16:17.417013 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:16:17.417095 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:16:17.417182 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Dec 16 12:16:17.417273 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 16 12:16:17.417357 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Dec 16 12:16:17.417437 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:16:17.417526 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:16:17.417618 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 16 12:16:17.417698 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Dec 16 12:16:17.417777 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:16:17.418966 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:16:17.419080 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Dec 16 12:16:17.419165 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Dec 16 12:16:17.419247 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Dec 16 12:16:17.419329 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:16:17.419411 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:16:17.419499 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 12:16:17.419588 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 12:16:17.419678 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Dec 16 12:16:17.419764 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:16:17.419875 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:16:17.419972 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Dec 16 12:16:17.420062 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:16:17.420144 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:16:17.420233 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Dec 16 12:16:17.420314 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:16:17.420393 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:16:17.420476 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Dec 16 12:16:17.420556 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:16:17.420639 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:16:17.420722 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Dec 16 12:16:17.420806 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Dec 16 12:16:17.420905 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:16:17.420990 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Dec 16 12:16:17.421091 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Dec 16 12:16:17.421183 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:16:17.421272 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Dec 16 12:16:17.421356 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Dec 16 12:16:17.421436 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:16:17.421518 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Dec 16 12:16:17.421600 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Dec 16 12:16:17.421680 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:16:17.421763 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Dec 16 12:16:17.421886 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Dec 16 12:16:17.421969 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:16:17.422069 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Dec 16 12:16:17.422159 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Dec 16 12:16:17.422240 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:16:17.422323 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Dec 16 12:16:17.422405 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Dec 16 12:16:17.422487 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:16:17.422575 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Dec 16 12:16:17.422657 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Dec 16 12:16:17.422737 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:16:17.422838 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Dec 16 12:16:17.422926 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Dec 16 12:16:17.423007 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:16:17.423094 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 16 12:16:17.423176 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Dec 16 12:16:17.423258 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Dec 16 12:16:17.423339 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:16:17.423433 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 16 12:16:17.423515 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Dec 16 12:16:17.423597 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Dec 16 12:16:17.423677 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:16:17.423760 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 16 12:16:17.423854 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Dec 16 12:16:17.423941 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Dec 16 12:16:17.424023 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:16:17.424111 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 16 12:16:17.424198 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Dec 16 12:16:17.424283 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Dec 16 12:16:17.424363 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:16:17.424445 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 16 12:16:17.424526 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Dec 16 12:16:17.424604 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Dec 16 12:16:17.424684 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:16:17.424770 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 16 12:16:17.424938 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Dec 16 12:16:17.425028 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Dec 16 12:16:17.425109 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:16:17.425196 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 16 12:16:17.425281 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Dec 16 12:16:17.425367 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Dec 16 12:16:17.425447 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:16:17.425535 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 16 12:16:17.425618 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Dec 16 12:16:17.425699 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Dec 16 12:16:17.425778 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:16:17.425877 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 16 12:16:17.425964 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Dec 16 12:16:17.426093 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Dec 16 12:16:17.426185 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:16:17.426272 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 16 12:16:17.426354 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Dec 16 12:16:17.426435 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Dec 16 12:16:17.426527 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:16:17.426621 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 16 12:16:17.426705 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Dec 16 12:16:17.426784 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Dec 16 12:16:17.426884 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Dec 16 12:16:17.426972 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 16 12:16:17.427052 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Dec 16 12:16:17.427135 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Dec 16 12:16:17.427214 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:16:17.427297 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 16 12:16:17.427378 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Dec 16 12:16:17.427458 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Dec 16 12:16:17.427536 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:16:17.427621 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 16 12:16:17.427715 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Dec 16 12:16:17.427801 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Dec 16 12:16:17.427902 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:16:17.427986 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 16 12:16:17.428068 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Dec 16 12:16:17.428148 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Dec 16 12:16:17.428227 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:16:17.428312 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 12:16:17.428385 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 16 12:16:17.428458 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 12:16:17.428544 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 12:16:17.428619 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:16:17.428705 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 12:16:17.428780 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:16:17.428891 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 12:16:17.428970 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:16:17.429053 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 12:16:17.429133 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:16:17.429225 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 12:16:17.429300 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:16:17.429382 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 12:16:17.429460 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:16:17.429545 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 12:16:17.429621 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:16:17.429704 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 12:16:17.429780 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:16:17.429885 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 12:16:17.429963 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:16:17.430062 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Dec 16 12:16:17.430153 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:16:17.430258 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Dec 16 12:16:17.430345 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:16:17.430430 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Dec 16 12:16:17.430514 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:16:17.430601 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Dec 16 12:16:17.430677 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:16:17.430759 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Dec 16 12:16:17.430855 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:16:17.430944 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Dec 16 12:16:17.431033 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:16:17.431120 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Dec 16 12:16:17.431199 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:16:17.431283 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Dec 16 12:16:17.431369 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:16:17.431452 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Dec 16 12:16:17.431528 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:16:17.431611 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Dec 16 12:16:17.431686 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Dec 16 12:16:17.431767 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:16:17.431889 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Dec 16 12:16:17.431971 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Dec 16 12:16:17.432046 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:16:17.432133 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Dec 16 12:16:17.432209 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Dec 16 12:16:17.432300 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:16:17.432389 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Dec 16 12:16:17.432474 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Dec 16 12:16:17.432553 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:16:17.432638 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Dec 16 12:16:17.432720 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Dec 16 12:16:17.432795 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:16:17.432895 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Dec 16 12:16:17.432980 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Dec 16 12:16:17.433061 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:16:17.433145 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Dec 16 12:16:17.433227 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Dec 16 12:16:17.433303 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:16:17.433385 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Dec 16 12:16:17.433460 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Dec 16 12:16:17.433534 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:16:17.433616 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 16 12:16:17.433695 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Dec 16 12:16:17.433770 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:16:17.433863 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Dec 16 12:16:17.433941 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Dec 16 12:16:17.434017 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:16:17.434120 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Dec 16 12:16:17.434197 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Dec 16 12:16:17.434271 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Dec 16 12:16:17.434353 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Dec 16 12:16:17.434428 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Dec 16 12:16:17.434501 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:16:17.434590 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Dec 16 12:16:17.434665 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Dec 16 12:16:17.434739 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:16:17.434853 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Dec 16 12:16:17.434938 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Dec 16 12:16:17.435019 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:16:17.435102 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Dec 16 12:16:17.435176 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Dec 16 12:16:17.435250 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:16:17.435261 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 12:16:17.435269 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 12:16:17.435277 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 12:16:17.435287 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 12:16:17.435295 kernel: iommu: Default domain type: Translated Dec 16 12:16:17.435303 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 12:16:17.435312 kernel: efivars: Registered efivars operations Dec 16 12:16:17.435319 kernel: vgaarb: loaded Dec 16 12:16:17.435327 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 12:16:17.435336 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:16:17.435345 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:16:17.435353 kernel: pnp: PnP ACPI 
init Dec 16 12:16:17.435445 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 12:16:17.435456 kernel: pnp: PnP ACPI: found 1 devices Dec 16 12:16:17.435464 kernel: NET: Registered PF_INET protocol family Dec 16 12:16:17.435472 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:16:17.435482 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 16 12:16:17.435490 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:16:17.435498 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 12:16:17.435506 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 16 12:16:17.435515 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 16 12:16:17.435523 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:16:17.435532 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:16:17.435542 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:16:17.435629 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 12:16:17.435641 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:16:17.435649 kernel: kvm [1]: HYP mode not available Dec 16 12:16:17.435657 kernel: Initialise system trusted keyrings Dec 16 12:16:17.435665 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 12:16:17.435673 kernel: Key type asymmetric registered Dec 16 12:16:17.435682 kernel: Asymmetric key parser 'x509' registered Dec 16 12:16:17.435690 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 12:16:17.435698 kernel: io scheduler mq-deadline registered Dec 16 12:16:17.435706 kernel: io scheduler kyber registered Dec 16 12:16:17.435714 kernel: io scheduler bfq registered Dec 16 12:16:17.435723 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 
12:16:17.435805 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Dec 16 12:16:17.435916 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Dec 16 12:16:17.436002 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.436085 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Dec 16 12:16:17.436165 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Dec 16 12:16:17.436244 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.436327 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Dec 16 12:16:17.436406 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Dec 16 12:16:17.436487 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.436569 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Dec 16 12:16:17.436648 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Dec 16 12:16:17.436727 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.436819 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Dec 16 12:16:17.436905 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Dec 16 12:16:17.436988 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.437072 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Dec 16 12:16:17.437155 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Dec 16 12:16:17.437234 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.437319 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Dec 16 12:16:17.437401 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Dec 16 12:16:17.437484 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.437570 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Dec 16 12:16:17.437652 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Dec 16 12:16:17.437731 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.437742 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 12:16:17.437834 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Dec 16 12:16:17.437919 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Dec 16 12:16:17.438002 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.438107 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Dec 16 12:16:17.438193 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Dec 16 12:16:17.438273 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.438356 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Dec 16 12:16:17.438437 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Dec 16 12:16:17.438520 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.438602 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Dec 16 12:16:17.438686 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Dec 16 12:16:17.438767 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.438868 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Dec 16 12:16:17.438953 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Dec 16 12:16:17.439033 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.439120 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Dec 16 12:16:17.439200 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Dec 16 12:16:17.439297 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.439384 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Dec 16 12:16:17.439464 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Dec 16 12:16:17.439542 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.439627 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Dec 16 12:16:17.439707 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Dec 16 12:16:17.439786 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.439797 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 12:16:17.439893 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Dec 16 12:16:17.439976 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Dec 16 12:16:17.440058 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.440142 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Dec 16 12:16:17.440223 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Dec 16 12:16:17.440303 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.440386 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Dec 16 12:16:17.440467 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Dec 16 12:16:17.440547 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.440633 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Dec 16 12:16:17.440713 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Dec 16 12:16:17.440793 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.440898 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Dec 16 12:16:17.440987 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Dec 16 12:16:17.441088 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.441183 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Dec 16 12:16:17.441266 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Dec 16 12:16:17.441356 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.441453 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Dec 16 12:16:17.441542 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Dec 16 12:16:17.441631 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.441732 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Dec 16 12:16:17.441839 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Dec 16 12:16:17.441928 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.441941 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 12:16:17.442028 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Dec 16 12:16:17.442139 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Dec 16 12:16:17.442242 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.442350 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Dec 16 12:16:17.442449 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Dec 16 12:16:17.442545 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.442638 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Dec 16 12:16:17.442722 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Dec 16 12:16:17.442804 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.442925 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Dec 16 12:16:17.443008 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Dec 16 12:16:17.443088 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.443171 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Dec 16 12:16:17.443257 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Dec 16 12:16:17.443349 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.443434 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Dec 16 12:16:17.443516 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Dec 16 12:16:17.443596 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.443679 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Dec 16 12:16:17.443759 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Dec 16 12:16:17.443852 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.443940 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Dec 16 12:16:17.444020 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Dec 16 12:16:17.444099 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.444181 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Dec 16 12:16:17.444261 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Dec 16 12:16:17.444340 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:16:17.444351 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 12:16:17.444361 kernel: ACPI: button: Power Button [PWRB] Dec 16 12:16:17.444446 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Dec 16 12:16:17.444534 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 12:16:17.444545 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:16:17.444553 kernel: thunder_xcv, ver 1.0 Dec 16 12:16:17.444561 kernel: thunder_bgx, ver 1.0 Dec 16 12:16:17.444569 kernel: nicpf, ver 1.0 Dec 16 12:16:17.444578 kernel: nicvf, ver 1.0 Dec 16 12:16:17.444671 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 12:16:17.444749 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T12:16:16 UTC (1765887376) Dec 16 12:16:17.444759 kernel: hid: raw HID events driver (C) Jiri 
Kosina Dec 16 12:16:17.444767 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 12:16:17.444775 kernel: watchdog: NMI not fully supported Dec 16 12:16:17.444785 kernel: watchdog: Hard watchdog permanently disabled Dec 16 12:16:17.444793 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:16:17.444801 kernel: Segment Routing with IPv6 Dec 16 12:16:17.444818 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 12:16:17.444827 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:16:17.444848 kernel: Key type dns_resolver registered Dec 16 12:16:17.444856 kernel: registered taskstats version 1 Dec 16 12:16:17.444866 kernel: Loading compiled-in X.509 certificates Dec 16 12:16:17.444874 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 545838337a91b65b763486e536766b3eec3ef99d' Dec 16 12:16:17.444882 kernel: Demotion targets for Node 0: null Dec 16 12:16:17.444890 kernel: Key type .fscrypt registered Dec 16 12:16:17.444898 kernel: Key type fscrypt-provisioning registered Dec 16 12:16:17.444906 kernel: ima: No TPM chip found, activating TPM-bypass! 
Dec 16 12:16:17.444914 kernel: ima: Allocated hash algorithm: sha1 Dec 16 12:16:17.444922 kernel: ima: No architecture policies found Dec 16 12:16:17.444932 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 16 12:16:17.444940 kernel: clk: Disabling unused clocks Dec 16 12:16:17.444948 kernel: PM: genpd: Disabling unused power domains Dec 16 12:16:17.444956 kernel: Freeing unused kernel memory: 12480K Dec 16 12:16:17.444964 kernel: Run /init as init process Dec 16 12:16:17.444971 kernel: with arguments: Dec 16 12:16:17.444979 kernel: /init Dec 16 12:16:17.444988 kernel: with environment: Dec 16 12:16:17.444996 kernel: HOME=/ Dec 16 12:16:17.445004 kernel: TERM=linux Dec 16 12:16:17.445011 kernel: ACPI: bus type USB registered Dec 16 12:16:17.445019 kernel: usbcore: registered new interface driver usbfs Dec 16 12:16:17.445028 kernel: usbcore: registered new interface driver hub Dec 16 12:16:17.445035 kernel: usbcore: registered new device driver usb Dec 16 12:16:17.445132 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:16:17.445217 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 16 12:16:17.445298 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 12:16:17.445379 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:16:17.445460 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 16 12:16:17.445540 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 16 12:16:17.445653 kernel: hub 1-0:1.0: USB hub found Dec 16 12:16:17.445753 kernel: hub 1-0:1.0: 4 ports detected Dec 16 12:16:17.445873 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. 
Dec 16 12:16:17.445975 kernel: hub 2-0:1.0: USB hub found Dec 16 12:16:17.446081 kernel: hub 2-0:1.0: 4 ports detected Dec 16 12:16:17.446183 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues Dec 16 12:16:17.446271 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB) Dec 16 12:16:17.446282 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 12:16:17.446291 kernel: GPT:25804799 != 104857599 Dec 16 12:16:17.446299 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 12:16:17.446308 kernel: GPT:25804799 != 104857599 Dec 16 12:16:17.446316 kernel: GPT: Use GNU Parted to correct GPT errors. Dec 16 12:16:17.446326 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9 Dec 16 12:16:17.446334 kernel: SCSI subsystem initialized Dec 16 12:16:17.446342 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 12:16:17.446351 kernel: device-mapper: uevent: version 1.0.3 Dec 16 12:16:17.446359 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 12:16:17.446368 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 16 12:16:17.446378 kernel: raid6: neonx8 gen() 15720 MB/s Dec 16 12:16:17.446386 kernel: raid6: neonx4 gen() 15764 MB/s Dec 16 12:16:17.446394 kernel: raid6: neonx2 gen() 13337 MB/s Dec 16 12:16:17.446402 kernel: raid6: neonx1 gen() 10511 MB/s Dec 16 12:16:17.446410 kernel: raid6: int64x8 gen() 6842 MB/s Dec 16 12:16:17.446418 kernel: raid6: int64x4 gen() 7344 MB/s Dec 16 12:16:17.446427 kernel: raid6: int64x2 gen() 6114 MB/s Dec 16 12:16:17.446435 kernel: raid6: int64x1 gen() 5041 MB/s Dec 16 12:16:17.446444 kernel: raid6: using algorithm neonx4 gen() 15764 MB/s Dec 16 12:16:17.446453 kernel: raid6: .... 
xor() 12258 MB/s, rmw enabled Dec 16 12:16:17.446461 kernel: raid6: using neon recovery algorithm Dec 16 12:16:17.446470 kernel: xor: measuring software checksum speed Dec 16 12:16:17.446479 kernel: 8regs : 21584 MB/sec Dec 16 12:16:17.446487 kernel: 32regs : 20828 MB/sec Dec 16 12:16:17.446498 kernel: arm64_neon : 28147 MB/sec Dec 16 12:16:17.446506 kernel: xor: using function: arm64_neon (28147 MB/sec) Dec 16 12:16:17.446610 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 12:16:17.446623 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 12:16:17.446632 kernel: BTRFS: device fsid d00a2bc5-1c68-4957-aa37-d070193fcf05 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (274) Dec 16 12:16:17.446640 kernel: BTRFS info (device dm-0): first mount of filesystem d00a2bc5-1c68-4957-aa37-d070193fcf05 Dec 16 12:16:17.446649 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:16:17.446770 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 12:16:17.446779 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 12:16:17.446787 kernel: loop: module loaded Dec 16 12:16:17.446796 kernel: loop0: detected capacity change from 0 to 91832 Dec 16 12:16:17.446804 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 12:16:17.446949 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 16 12:16:17.446966 systemd[1]: Successfully made /usr/ read-only. Dec 16 12:16:17.446977 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:16:17.446987 systemd[1]: Detected virtualization kvm. 
Dec 16 12:16:17.446995 systemd[1]: Detected architecture arm64. Dec 16 12:16:17.447004 systemd[1]: Running in initrd. Dec 16 12:16:17.447012 systemd[1]: No hostname configured, using default hostname. Dec 16 12:16:17.447023 systemd[1]: Hostname set to . Dec 16 12:16:17.447032 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:16:17.447040 systemd[1]: Queued start job for default target initrd.target. Dec 16 12:16:17.447049 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:16:17.447058 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:16:17.447067 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:16:17.447078 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 12:16:17.447087 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:16:17.447096 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 12:16:17.447105 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 12:16:17.447114 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:16:17.447123 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:16:17.447133 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:16:17.447142 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:16:17.447150 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:16:17.447159 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:16:17.447167 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:16:17.447176 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. 
Dec 16 12:16:17.447192 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:16:17.447207 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:16:17.447216 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 12:16:17.447224 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 12:16:17.447233 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:16:17.447242 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:16:17.447251 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:16:17.447261 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:16:17.447270 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 12:16:17.447279 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 12:16:17.447288 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:16:17.447296 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 12:16:17.447305 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 12:16:17.447314 systemd[1]: Starting systemd-fsck-usr.service... Dec 16 12:16:17.447324 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:16:17.447333 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:16:17.447342 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:16:17.447351 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. 
Dec 16 12:16:17.447361 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:16:17.447370 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 12:16:17.447380 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:16:17.447414 systemd-journald[416]: Collecting audit messages is enabled. Dec 16 12:16:17.447436 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 12:16:17.447445 kernel: Bridge firewalling registered Dec 16 12:16:17.447454 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:16:17.447463 kernel: audit: type=1130 audit(1765887377.383:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.447472 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:16:17.447482 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:16:17.447491 kernel: audit: type=1130 audit(1765887377.390:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.447500 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:16:17.447510 kernel: audit: type=1130 audit(1765887377.400:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.447518 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... 
Dec 16 12:16:17.447527 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:16:17.447538 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:16:17.447548 kernel: audit: type=1130 audit(1765887377.410:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.447556 kernel: audit: type=1334 audit(1765887377.414:6): prog-id=6 op=LOAD Dec 16 12:16:17.447566 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:16:17.447575 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:16:17.447584 kernel: audit: type=1130 audit(1765887377.426:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.447593 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:16:17.447603 kernel: audit: type=1130 audit(1765887377.435:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.447612 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 12:16:17.447621 systemd-journald[416]: Journal started Dec 16 12:16:17.447640 systemd-journald[416]: Runtime Journal (/run/log/journal/e5d80d44913a4451a6825aeb7bab1477) is 8M, max 319.5M, 311.5M free. Dec 16 12:16:17.383000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:17.390000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.400000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.410000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.414000 audit: BPF prog-id=6 op=LOAD Dec 16 12:16:17.426000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.379995 systemd-modules-load[418]: Inserted module 'br_netfilter' Dec 16 12:16:17.454623 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:16:17.453000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.457835 kernel: audit: type=1130 audit(1765887377.453:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.458630 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Dec 16 12:16:17.465803 systemd-resolved[432]: Positive Trust Anchors: Dec 16 12:16:17.465829 systemd-resolved[432]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:16:17.465832 systemd-resolved[432]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:16:17.465863 systemd-resolved[432]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:16:17.473272 systemd-tmpfiles[453]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 12:16:17.482543 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:16:17.482000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:17.486930 dracut-cmdline[444]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749 Dec 16 12:16:17.491399 kernel: audit: type=1130 audit(1765887377.482:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.491617 systemd-resolved[432]: Defaulting to hostname 'linux'. Dec 16 12:16:17.492447 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:16:17.492000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.493805 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:16:17.565876 kernel: Loading iSCSI transport class v2.0-870. Dec 16 12:16:17.575975 kernel: iscsi: registered transport (tcp) Dec 16 12:16:17.589968 kernel: iscsi: registered transport (qla4xxx) Dec 16 12:16:17.589994 kernel: QLogic iSCSI HBA Driver Dec 16 12:16:17.611171 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:16:17.632441 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:16:17.632000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:17.634595 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:16:17.681486 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 12:16:17.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.683921 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 12:16:17.685400 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 12:16:17.716008 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:16:17.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.717000 audit: BPF prog-id=7 op=LOAD Dec 16 12:16:17.717000 audit: BPF prog-id=8 op=LOAD Dec 16 12:16:17.718694 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:16:17.748718 systemd-udevd[693]: Using default interface naming scheme 'v257'. Dec 16 12:16:17.756000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.756570 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:16:17.760756 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 12:16:17.784467 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:16:17.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Dec 16 12:16:17.785000 audit: BPF prog-id=9 op=LOAD Dec 16 12:16:17.786800 dracut-pre-trigger[766]: rd.md=0: removing MD RAID activation Dec 16 12:16:17.787176 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:16:17.812853 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:16:17.812000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.815110 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:16:17.829249 systemd-networkd[805]: lo: Link UP Dec 16 12:16:17.829259 systemd-networkd[805]: lo: Gained carrier Dec 16 12:16:17.829923 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:16:17.830000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.830920 systemd[1]: Reached target network.target - Network. Dec 16 12:16:17.900221 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:16:17.901000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:17.906546 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 12:16:17.968097 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM. Dec 16 12:16:17.976795 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT. Dec 16 12:16:17.984283 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A. 
Dec 16 12:16:17.998110 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 12:16:18.001973 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 12:16:18.009837 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 16 12:16:18.011847 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 16 12:16:18.014825 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 16 12:16:18.016842 disk-uuid[870]: Primary Header is updated. Dec 16 12:16:18.016842 disk-uuid[870]: Secondary Entries is updated. Dec 16 12:16:18.016842 disk-uuid[870]: Secondary Header is updated. Dec 16 12:16:18.042600 systemd-networkd[805]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:16:18.042616 systemd-networkd[805]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:16:18.043062 systemd-networkd[805]: eth0: Link UP Dec 16 12:16:18.045745 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:16:18.047000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:18.045945 systemd-networkd[805]: eth0: Gained carrier Dec 16 12:16:18.045959 systemd-networkd[805]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:16:18.046127 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:16:18.048523 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... 
Dec 16 12:16:18.052040 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:16:18.068473 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 16 12:16:18.068696 kernel: usbcore: registered new interface driver usbhid Dec 16 12:16:18.069325 kernel: usbhid: USB HID core driver Dec 16 12:16:18.086126 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:16:18.086000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:18.106889 systemd-networkd[805]: eth0: DHCPv4 address 10.0.29.66/25, gateway 10.0.29.1 acquired from 10.0.29.1 Dec 16 12:16:18.119870 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Dec 16 12:16:18.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:18.121327 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:16:18.122870 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:16:18.124709 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:16:18.127522 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 12:16:18.164882 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:16:18.164000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:19.052925 disk-uuid[871]: Warning: The kernel is still using the old partition table. 
Dec 16 12:16:19.052925 disk-uuid[871]: The new table will be used at the next reboot or after you Dec 16 12:16:19.052925 disk-uuid[871]: run partprobe(8) or kpartx(8) Dec 16 12:16:19.052925 disk-uuid[871]: The operation has completed successfully. Dec 16 12:16:19.058409 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 12:16:19.058000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:19.058000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:19.058518 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 12:16:19.060612 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 12:16:19.104870 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (905) Dec 16 12:16:19.104986 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:16:19.105832 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:16:19.111081 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:16:19.111152 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:16:19.116835 kernel: BTRFS info (device vda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:16:19.117954 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 12:16:19.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:19.120743 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Dec 16 12:16:19.261571 ignition[924]: Ignition 2.24.0 Dec 16 12:16:19.262343 ignition[924]: Stage: fetch-offline Dec 16 12:16:19.262389 ignition[924]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:19.262408 ignition[924]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:19.265000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:19.263951 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:16:19.262585 ignition[924]: parsed url from cmdline: "" Dec 16 12:16:19.266466 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Dec 16 12:16:19.262590 ignition[924]: no config URL provided Dec 16 12:16:19.262595 ignition[924]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:16:19.262604 ignition[924]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:16:19.262608 ignition[924]: failed to fetch config: resource requires networking Dec 16 12:16:19.262772 ignition[924]: Ignition finished successfully Dec 16 12:16:19.290636 ignition[935]: Ignition 2.24.0 Dec 16 12:16:19.290654 ignition[935]: Stage: fetch Dec 16 12:16:19.290795 ignition[935]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:19.290804 ignition[935]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:19.290903 ignition[935]: parsed url from cmdline: "" Dec 16 12:16:19.290907 ignition[935]: no config URL provided Dec 16 12:16:19.290911 ignition[935]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:16:19.290917 ignition[935]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:16:19.291301 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 16 12:16:19.291318 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... 
Dec 16 12:16:19.291547 ignition[935]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1 Dec 16 12:16:19.445133 systemd-networkd[805]: eth0: Gained IPv6LL Dec 16 12:16:20.291555 ignition[935]: config drive ("/dev/disk/by-label/config-2") not found. Waiting... Dec 16 12:16:20.291710 ignition[935]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting... Dec 16 12:16:20.343283 ignition[935]: GET result: OK Dec 16 12:16:20.343540 ignition[935]: parsing config with SHA512: 844ebb582cf450e71253e76426c07b8dce29febbefc468ebeb5936f0a22d5b917b9cc0de910409629bc09c2087902a880847be80f1a3af9f2d93d244807830fc Dec 16 12:16:20.348671 unknown[935]: fetched base config from "system" Dec 16 12:16:20.348682 unknown[935]: fetched base config from "system" Dec 16 12:16:20.349039 ignition[935]: fetch: fetch complete Dec 16 12:16:20.348687 unknown[935]: fetched user config from "openstack" Dec 16 12:16:20.349043 ignition[935]: fetch: fetch passed Dec 16 12:16:20.355893 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 12:16:20.355918 kernel: audit: type=1130 audit(1765887380.351:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.351000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.351312 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 12:16:20.349082 ignition[935]: Ignition finished successfully Dec 16 12:16:20.353253 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... 
Dec 16 12:16:20.378577 ignition[943]: Ignition 2.24.0 Dec 16 12:16:20.378596 ignition[943]: Stage: kargs Dec 16 12:16:20.378749 ignition[943]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:20.378757 ignition[943]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:20.379512 ignition[943]: kargs: kargs passed Dec 16 12:16:20.379559 ignition[943]: Ignition finished successfully Dec 16 12:16:20.382169 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Dec 16 12:16:20.383000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.384436 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 12:16:20.387670 kernel: audit: type=1130 audit(1765887380.383:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.412049 ignition[950]: Ignition 2.24.0 Dec 16 12:16:20.412063 ignition[950]: Stage: disks Dec 16 12:16:20.412212 ignition[950]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:20.412220 ignition[950]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:20.412994 ignition[950]: disks: disks passed Dec 16 12:16:20.415113 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 12:16:20.418900 kernel: audit: type=1130 audit(1765887380.416:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.416000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:20.413039 ignition[950]: Ignition finished successfully Dec 16 12:16:20.416340 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 12:16:20.420234 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 12:16:20.421711 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:16:20.423331 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:16:20.424926 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:16:20.427411 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 12:16:20.473941 systemd-fsck[959]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Dec 16 12:16:20.477661 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 12:16:20.479000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.481025 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 12:16:20.483725 kernel: audit: type=1130 audit(1765887380.479:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.588837 kernel: EXT4-fs (vda9): mounted filesystem 0e69f709-36a9-4e15-b0c9-c7e150185653 r/w with ordered data mode. Quota mode: none. Dec 16 12:16:20.589590 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 12:16:20.590827 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 12:16:20.594083 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:16:20.595874 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... 
Dec 16 12:16:20.596747 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 16 12:16:20.597374 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 16 12:16:20.598580 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 12:16:20.598611 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:16:20.612990 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 12:16:20.615044 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 12:16:20.626848 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (967) Dec 16 12:16:20.630637 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:16:20.630685 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:16:20.635875 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:16:20.635915 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:16:20.640880 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:16:20.667841 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:20.769579 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 12:16:20.769000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.771909 systemd[1]: Starting ignition-mount.service - Ignition (mount)... 
Dec 16 12:16:20.775087 kernel: audit: type=1130 audit(1765887380.769:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.775133 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 12:16:20.796928 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 12:16:20.798315 kernel: BTRFS info (device vda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:16:20.819134 ignition[1068]: INFO : Ignition 2.24.0 Dec 16 12:16:20.819134 ignition[1068]: INFO : Stage: mount Dec 16 12:16:20.823000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.819965 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 12:16:20.831496 kernel: audit: type=1130 audit(1765887380.823:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.831521 kernel: audit: type=1130 audit(1765887380.827:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:20.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:20.831566 ignition[1068]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:20.831566 ignition[1068]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:20.831566 ignition[1068]: INFO : mount: mount passed Dec 16 12:16:20.831566 ignition[1068]: INFO : Ignition finished successfully Dec 16 12:16:20.824922 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 12:16:21.709862 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:23.718863 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:27.727850 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:27.731887 coreos-metadata[969]: Dec 16 12:16:27.731 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:16:27.750582 coreos-metadata[969]: Dec 16 12:16:27.750 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:16:27.886612 coreos-metadata[969]: Dec 16 12:16:27.886 INFO Fetch successful Dec 16 12:16:27.887656 coreos-metadata[969]: Dec 16 12:16:27.886 INFO wrote hostname ci-4547-0-0-5-b12717c6ea to /sysroot/etc/hostname Dec 16 12:16:27.889292 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 16 12:16:27.889381 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. Dec 16 12:16:27.894000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:27.895901 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 12:16:27.901635 kernel: audit: type=1130 audit(1765887387.894:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:27.901661 kernel: audit: type=1131 audit(1765887387.894:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:27.894000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:27.924319 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:16:27.961831 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1088) Dec 16 12:16:27.966050 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:16:27.966139 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:16:27.971068 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:16:27.971098 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:16:27.972667 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
Dec 16 12:16:27.996493 ignition[1106]: INFO : Ignition 2.24.0 Dec 16 12:16:27.996493 ignition[1106]: INFO : Stage: files Dec 16 12:16:27.998322 ignition[1106]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:27.998322 ignition[1106]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:27.998322 ignition[1106]: DEBUG : files: compiled without relabeling support, skipping Dec 16 12:16:28.001682 ignition[1106]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 12:16:28.001682 ignition[1106]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 12:16:28.004647 ignition[1106]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 12:16:28.004647 ignition[1106]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 12:16:28.004647 ignition[1106]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 12:16:28.003083 unknown[1106]: wrote ssh authorized keys file for user: core Dec 16 12:16:28.011358 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:16:28.011358 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 16 12:16:28.074029 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 12:16:28.221776 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file 
"/sysroot/home/core/install.sh" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:16:28.223482 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:16:28.235253 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:16:28.235253 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:16:28.235253 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:16:28.235253 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:16:28.235253 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:16:28.235253 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET 
https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Dec 16 12:16:28.488190 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 12:16:29.023161 ignition[1106]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:16:29.023161 ignition[1106]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:16:29.027066 ignition[1106]: INFO : files: files passed Dec 16 12:16:29.027066 ignition[1106]: INFO : Ignition finished successfully Dec 16 12:16:29.043670 kernel: audit: type=1130 audit(1765887389.031:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:29.031000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.030890 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 12:16:29.033928 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 12:16:29.038084 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 12:16:29.048950 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 12:16:29.049047 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 12:16:29.056155 kernel: audit: type=1130 audit(1765887389.050:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.056183 kernel: audit: type=1131 audit(1765887389.050:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.050000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.050000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:29.056264 initrd-setup-root-after-ignition[1139]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:16:29.056264 initrd-setup-root-after-ignition[1139]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:16:29.059187 initrd-setup-root-after-ignition[1143]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:16:29.060491 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:16:29.061000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.061736 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 12:16:29.069179 kernel: audit: type=1130 audit(1765887389.061:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.069269 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 12:16:29.119084 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 12:16:29.119221 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Dec 16 12:16:29.126326 kernel: audit: type=1130 audit(1765887389.120:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.126362 kernel: audit: type=1131 audit(1765887389.120:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:29.120000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.120000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.121045 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 12:16:29.127124 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 12:16:29.128883 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 12:16:29.129793 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 12:16:29.166995 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:16:29.167000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.170628 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 12:16:29.172938 kernel: audit: type=1130 audit(1765887389.167:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.190396 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:16:29.190609 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:16:29.192458 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:16:29.194134 systemd[1]: Stopped target timers.target - Timer Units. 
Dec 16 12:16:29.195638 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 12:16:29.196000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.199873 kernel: audit: type=1131 audit(1765887389.196:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.195765 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:16:29.200122 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 12:16:29.201687 systemd[1]: Stopped target basic.target - Basic System. Dec 16 12:16:29.203328 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 12:16:29.204699 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:16:29.206401 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 12:16:29.208014 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:16:29.209650 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 12:16:29.211239 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:16:29.212924 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 12:16:29.214707 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 12:16:29.216199 systemd[1]: Stopped target swap.target - Swaps. Dec 16 12:16:29.217507 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 12:16:29.218000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:29.217639 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:16:29.219562 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:16:29.220578 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:16:29.222196 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 12:16:29.225000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.222276 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:16:29.223794 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 12:16:29.227000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.223930 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 12:16:29.230000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.226750 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 12:16:29.226889 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:16:29.228858 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 12:16:29.228965 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 12:16:29.235000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:29.231167 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 12:16:29.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.233307 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 12:16:29.238000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.234274 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 12:16:29.234392 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:16:29.235968 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 16 12:16:29.236072 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:16:29.237548 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 12:16:29.237656 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:16:29.242476 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 12:16:29.244965 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 12:16:29.246000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.246000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:29.256875 ignition[1163]: INFO : Ignition 2.24.0 Dec 16 12:16:29.256875 ignition[1163]: INFO : Stage: umount Dec 16 12:16:29.258622 ignition[1163]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:16:29.258622 ignition[1163]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:16:29.258622 ignition[1163]: INFO : umount: umount passed Dec 16 12:16:29.258622 ignition[1163]: INFO : Ignition finished successfully Dec 16 12:16:29.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.259643 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 12:16:29.265000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.262010 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 12:16:29.268000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.270000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.271000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.262193 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 12:16:29.265630 systemd[1]: sysroot-boot.service: Deactivated successfully. 
Dec 16 12:16:29.274000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.265745 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 12:16:29.266908 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 12:16:29.266949 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 12:16:29.268989 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 12:16:29.269036 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 12:16:29.270252 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 12:16:29.270305 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 12:16:29.271146 systemd[1]: Stopped target network.target - Network. Dec 16 12:16:29.273179 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 12:16:29.286000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.273231 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:16:29.288000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.274850 systemd[1]: Stopped target paths.target - Path Units. Dec 16 12:16:29.289000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.276192 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. 
Dec 16 12:16:29.276887 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:16:29.278537 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 12:16:29.279909 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 12:16:29.281288 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 12:16:29.281325 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:16:29.282632 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 12:16:29.282660 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:16:29.284076 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Dec 16 12:16:29.284096 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:16:29.285579 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 12:16:29.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.285632 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 12:16:29.286939 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 12:16:29.286981 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 12:16:29.288456 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 12:16:29.288501 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 12:16:29.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:29.307000 audit: BPF prog-id=6 op=UNLOAD Dec 16 12:16:29.290188 systemd[1]: Stopping systemd-networkd.service - Network Configuration... 
Dec 16 12:16:29.291580 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 16 12:16:29.309000 audit: BPF prog-id=9 op=UNLOAD
Dec 16 12:16:29.301434 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 16 12:16:29.301547 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 16 12:16:29.306390 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 16 12:16:29.306503 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 16 12:16:29.314000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.309569 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 16 12:16:29.317000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.311101 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 16 12:16:29.318000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.311150 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:16:29.313360 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 16 12:16:29.314139 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 16 12:16:29.314196 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:16:29.315785 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 16 12:16:29.315847 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:16:29.317384 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 16 12:16:29.317424 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:16:29.318974 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:16:29.335266 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 16 12:16:29.335437 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:16:29.337000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.338639 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 16 12:16:29.338732 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:16:29.340535 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 16 12:16:29.340567 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:16:29.343000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.342086 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 16 12:16:29.342136 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:16:29.345000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.344412 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 16 12:16:29.344461 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:16:29.348000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.346627 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 16 12:16:29.346673 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:16:29.352520 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 16 12:16:29.353468 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 16 12:16:29.355000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.353525 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:16:29.357000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.355507 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 16 12:16:29.359000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.355553 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:16:29.360000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.357260 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 16 12:16:29.362000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.357302 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:16:29.364000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.359288 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 16 12:16:29.366000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.366000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:29.359329 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:16:29.360920 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:16:29.360964 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:16:29.363390 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 16 12:16:29.363492 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 16 12:16:29.364615 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 16 12:16:29.364696 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 16 12:16:29.367294 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 16 12:16:29.369784 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 16 12:16:29.385761 systemd[1]: Switching root.
Dec 16 12:16:29.420138 systemd-journald[416]: Journal stopped
Dec 16 12:16:30.283432 systemd-journald[416]: Received SIGTERM from PID 1 (systemd).
Dec 16 12:16:30.283507 kernel: SELinux: policy capability network_peer_controls=1
Dec 16 12:16:30.283528 kernel: SELinux: policy capability open_perms=1
Dec 16 12:16:30.283539 kernel: SELinux: policy capability extended_socket_class=1
Dec 16 12:16:30.283551 kernel: SELinux: policy capability always_check_network=0
Dec 16 12:16:30.283564 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 16 12:16:30.283579 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 16 12:16:30.283593 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 16 12:16:30.283605 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 16 12:16:30.283615 kernel: SELinux: policy capability userspace_initial_context=0
Dec 16 12:16:30.283626 systemd[1]: Successfully loaded SELinux policy in 49.736ms.
Dec 16 12:16:30.283644 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.803ms.
Dec 16 12:16:30.283656 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:16:30.283668 systemd[1]: Detected virtualization kvm.
Dec 16 12:16:30.283682 systemd[1]: Detected architecture arm64.
Dec 16 12:16:30.283694 systemd[1]: Detected first boot.
Dec 16 12:16:30.283705 systemd[1]: Hostname set to .
Dec 16 12:16:30.283716 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:16:30.283727 zram_generator::config[1212]: No configuration found.
Dec 16 12:16:30.283743 kernel: NET: Registered PF_VSOCK protocol family
Dec 16 12:16:30.283754 systemd[1]: Populated /etc with preset unit settings.
Dec 16 12:16:30.283768 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 16 12:16:30.283781 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 16 12:16:30.283792 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 16 12:16:30.283817 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 16 12:16:30.283832 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 16 12:16:30.283844 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 16 12:16:30.283855 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 16 12:16:30.283866 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 16 12:16:30.283878 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 16 12:16:30.283889 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 16 12:16:30.283900 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 16 12:16:30.283912 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:16:30.283922 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:16:30.283934 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 16 12:16:30.283944 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 16 12:16:30.283960 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 16 12:16:30.283971 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:16:30.283984 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 16 12:16:30.283995 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:16:30.284007 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:16:30.284019 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 16 12:16:30.284030 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 16 12:16:30.284041 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 16 12:16:30.284053 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 12:16:30.284066 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:16:30.284077 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:16:30.284088 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Dec 16 12:16:30.284101 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:16:30.284112 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:16:30.284122 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 16 12:16:30.284133 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 16 12:16:30.284144 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 16 12:16:30.284155 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:16:30.284166 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Dec 16 12:16:30.284177 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:16:30.284188 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Dec 16 12:16:30.284199 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Dec 16 12:16:30.284210 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:16:30.284221 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:16:30.284231 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 16 12:16:30.284244 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 16 12:16:30.284259 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 16 12:16:30.284270 systemd[1]: Mounting media.mount - External Media Directory...
Dec 16 12:16:30.284284 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 16 12:16:30.284296 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 16 12:16:30.284308 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 16 12:16:30.284321 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 16 12:16:30.284334 systemd[1]: Reached target machines.target - Containers.
Dec 16 12:16:30.284347 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 16 12:16:30.284361 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:16:30.284374 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:16:30.284386 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 16 12:16:30.284399 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:16:30.284414 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:16:30.284426 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:16:30.284438 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 16 12:16:30.284451 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:16:30.284463 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 16 12:16:30.284478 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 16 12:16:30.284491 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 16 12:16:30.284502 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 16 12:16:30.284513 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 16 12:16:30.284524 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:16:30.284536 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:16:30.284549 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:16:30.284560 kernel: fuse: init (API version 7.41)
Dec 16 12:16:30.284571 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:16:30.284582 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 16 12:16:30.284593 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 16 12:16:30.284604 kernel: ACPI: bus type drm_connector registered
Dec 16 12:16:30.284614 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:16:30.284625 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 16 12:16:30.284637 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 16 12:16:30.284648 systemd[1]: Mounted media.mount - External Media Directory.
Dec 16 12:16:30.284659 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 16 12:16:30.284670 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 16 12:16:30.284684 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 16 12:16:30.284719 systemd-journald[1275]: Collecting audit messages is enabled.
Dec 16 12:16:30.284744 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:16:30.284756 systemd-journald[1275]: Journal started
Dec 16 12:16:30.284780 systemd-journald[1275]: Runtime Journal (/run/log/journal/e5d80d44913a4451a6825aeb7bab1477) is 8M, max 319.5M, 311.5M free.
Dec 16 12:16:30.134000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1
Dec 16 12:16:30.225000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.228000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.231000 audit: BPF prog-id=14 op=UNLOAD
Dec 16 12:16:30.231000 audit: BPF prog-id=13 op=UNLOAD
Dec 16 12:16:30.231000 audit: BPF prog-id=15 op=LOAD
Dec 16 12:16:30.232000 audit: BPF prog-id=16 op=LOAD
Dec 16 12:16:30.232000 audit: BPF prog-id=17 op=LOAD
Dec 16 12:16:30.281000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Dec 16 12:16:30.281000 audit[1275]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=ffffd9db37b0 a2=4000 a3=0 items=0 ppid=1 pid=1275 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:16:30.281000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Dec 16 12:16:30.054162 systemd[1]: Queued start job for default target multi-user.target.
Dec 16 12:16:30.070146 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Dec 16 12:16:30.070551 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 16 12:16:30.285000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.288821 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:16:30.288000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.289408 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 16 12:16:30.289624 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 16 12:16:30.290000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.290000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.292078 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:16:30.292829 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:16:30.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.292000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.295089 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:16:30.296028 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:16:30.296000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.296000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.297284 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:16:30.298830 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:16:30.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.299000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.300230 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 16 12:16:30.300399 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 16 12:16:30.301000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.301000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.301746 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:16:30.301964 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:16:30.302000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.302000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.303364 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:16:30.304000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.304863 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:16:30.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.307098 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 16 12:16:30.308000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.308739 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 16 12:16:30.309000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.310689 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 16 12:16:30.311000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.322734 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:16:30.324158 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Dec 16 12:16:30.326293 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 16 12:16:30.328199 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 16 12:16:30.329196 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 16 12:16:30.329224 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 12:16:30.330939 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 16 12:16:30.332947 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:16:30.333069 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:16:30.340979 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 16 12:16:30.343621 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 16 12:16:30.344743 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:16:30.345714 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 16 12:16:30.346777 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:16:30.349448 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:16:30.353976 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 16 12:16:30.358619 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 12:16:30.359672 systemd-journald[1275]: Time spent on flushing to /var/log/journal/e5d80d44913a4451a6825aeb7bab1477 is 33.110ms for 1816 entries.
Dec 16 12:16:30.359672 systemd-journald[1275]: System Journal (/var/log/journal/e5d80d44913a4451a6825aeb7bab1477) is 8M, max 588.1M, 580.1M free.
Dec 16 12:16:30.404063 systemd-journald[1275]: Received client request to flush runtime journal.
Dec 16 12:16:30.404117 kernel: loop1: detected capacity change from 0 to 45344
Dec 16 12:16:30.362000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.368000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.381000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.397000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.362001 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:16:30.364935 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 16 12:16:30.366382 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 16 12:16:30.368011 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 16 12:16:30.371694 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 16 12:16:30.377689 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 16 12:16:30.380402 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:16:30.390722 systemd-tmpfiles[1334]: ACLs are not supported, ignoring.
Dec 16 12:16:30.390733 systemd-tmpfiles[1334]: ACLs are not supported, ignoring.
Dec 16 12:16:30.394695 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:16:30.401094 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 16 12:16:30.407916 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 16 12:16:30.409000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.420066 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 16 12:16:30.431000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.432922 kernel: loop2: detected capacity change from 0 to 100192
Dec 16 12:16:30.457039 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 16 12:16:30.457000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.458000 audit: BPF prog-id=18 op=LOAD
Dec 16 12:16:30.458000 audit: BPF prog-id=19 op=LOAD
Dec 16 12:16:30.458000 audit: BPF prog-id=20 op=LOAD
Dec 16 12:16:30.459889 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Dec 16 12:16:30.460000 audit: BPF prog-id=21 op=LOAD
Dec 16 12:16:30.462130 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:16:30.464980 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:16:30.466000 audit: BPF prog-id=22 op=LOAD
Dec 16 12:16:30.467000 audit: BPF prog-id=23 op=LOAD
Dec 16 12:16:30.467000 audit: BPF prog-id=24 op=LOAD
Dec 16 12:16:30.468464 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager...
Dec 16 12:16:30.469000 audit: BPF prog-id=25 op=LOAD
Dec 16 12:16:30.471844 kernel: loop3: detected capacity change from 0 to 1648
Dec 16 12:16:30.478000 audit: BPF prog-id=26 op=LOAD
Dec 16 12:16:30.478000 audit: BPF prog-id=27 op=LOAD
Dec 16 12:16:30.480979 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 16 12:16:30.496443 systemd-tmpfiles[1355]: ACLs are not supported, ignoring.
Dec 16 12:16:30.496468 systemd-tmpfiles[1355]: ACLs are not supported, ignoring.
Dec 16 12:16:30.500527 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:16:30.501000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.503848 kernel: loop4: detected capacity change from 0 to 200800
Dec 16 12:16:30.515037 systemd-nsresourced[1356]: Not setting up BPF subsystem, as functionality has been disabled at compile time.
Dec 16 12:16:30.515978 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager.
Dec 16 12:16:30.516000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.522805 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 16 12:16:30.524000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.555080 kernel: loop5: detected capacity change from 0 to 45344
Dec 16 12:16:30.567197 systemd-oomd[1353]: No swap; memory pressure usage will be degraded
Dec 16 12:16:30.567906 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer.
Dec 16 12:16:30.568000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.570830 kernel: loop6: detected capacity change from 0 to 100192
Dec 16 12:16:30.578245 systemd-resolved[1354]: Positive Trust Anchors:
Dec 16 12:16:30.578346 systemd-resolved[1354]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 12:16:30.578352 systemd-resolved[1354]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Dec 16 12:16:30.578385 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 12:16:30.586840 kernel: loop7: detected capacity change from 0 to 1648
Dec 16 12:16:30.591223 systemd-resolved[1354]: Using system hostname 'ci-4547-0-0-5-b12717c6ea'.
Dec 16 12:16:30.592611 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 12:16:30.594220 kernel: loop1: detected capacity change from 0 to 200800
Dec 16 12:16:30.593000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.594539 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:16:30.606671 (sd-merge)[1376]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'.
Dec 16 12:16:30.609598 (sd-merge)[1376]: Merged extensions into '/usr'.
Dec 16 12:16:30.613533 systemd[1]: Reload requested from client PID 1332 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 16 12:16:30.613547 systemd[1]: Reloading...
Dec 16 12:16:30.673861 zram_generator::config[1404]: No configuration found.
Dec 16 12:16:30.821922 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 16 12:16:30.822004 systemd[1]: Reloading finished in 208 ms.
Dec 16 12:16:30.840926 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 16 12:16:30.842353 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 16 12:16:30.841000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:30.855298 systemd[1]: Starting ensure-sysext.service...
Dec 16 12:16:30.858000 audit: BPF prog-id=8 op=UNLOAD
Dec 16 12:16:30.858000 audit: BPF prog-id=7 op=UNLOAD
Dec 16 12:16:30.857580 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 12:16:30.858000 audit: BPF prog-id=28 op=LOAD
Dec 16 12:16:30.858000 audit: BPF prog-id=29 op=LOAD
Dec 16 12:16:30.860145 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:16:30.861000 audit: BPF prog-id=30 op=LOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=21 op=UNLOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=31 op=LOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=18 op=UNLOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=32 op=LOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=33 op=LOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=19 op=UNLOAD
Dec 16 12:16:30.861000 audit: BPF prog-id=20 op=UNLOAD
Dec 16 12:16:30.862000 audit: BPF prog-id=34 op=LOAD
Dec 16 12:16:30.862000 audit: BPF prog-id=15 op=UNLOAD
Dec 16 12:16:30.862000 audit: BPF prog-id=35 op=LOAD
Dec 16 12:16:30.862000 audit: BPF prog-id=36 op=LOAD
Dec 16 12:16:30.862000 audit: BPF prog-id=16 op=UNLOAD
Dec 16 12:16:30.862000 audit: BPF prog-id=17 op=UNLOAD
Dec 16 12:16:30.863000 audit: BPF prog-id=37 op=LOAD
Dec 16 12:16:30.863000 audit: BPF prog-id=22 op=UNLOAD
Dec 16 12:16:30.863000 audit: BPF prog-id=38 op=LOAD
Dec 16 12:16:30.863000 audit: BPF prog-id=39 op=LOAD
Dec 16 12:16:30.863000 audit: BPF prog-id=23 op=UNLOAD
Dec 16 12:16:30.863000 audit: BPF prog-id=24 op=UNLOAD
Dec 16 12:16:30.867000 audit: BPF prog-id=40 op=LOAD
Dec 16 12:16:30.867000 audit: BPF prog-id=25 op=UNLOAD
Dec 16 12:16:30.868000 audit: BPF prog-id=41 op=LOAD
Dec 16 12:16:30.868000 audit: BPF prog-id=42 op=LOAD
Dec 16 12:16:30.868000 audit: BPF prog-id=26 op=UNLOAD
Dec 16 12:16:30.868000 audit: BPF prog-id=27 op=UNLOAD
Dec 16 12:16:30.876672 systemd[1]: Reload requested from client PID 1443 ('systemctl') (unit ensure-sysext.service)...
Dec 16 12:16:30.876691 systemd[1]: Reloading...
Dec 16 12:16:30.878417 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 16 12:16:30.878456 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 16 12:16:30.879568 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 16 12:16:30.880522 systemd-tmpfiles[1444]: ACLs are not supported, ignoring.
Dec 16 12:16:30.880580 systemd-tmpfiles[1444]: ACLs are not supported, ignoring.
Dec 16 12:16:30.887210 systemd-tmpfiles[1444]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 12:16:30.887224 systemd-tmpfiles[1444]: Skipping /boot
Dec 16 12:16:30.887643 systemd-udevd[1445]: Using default interface naming scheme 'v257'.
Dec 16 12:16:30.893955 systemd-tmpfiles[1444]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 12:16:30.893967 systemd-tmpfiles[1444]: Skipping /boot
Dec 16 12:16:30.933950 zram_generator::config[1478]: No configuration found.
Dec 16 12:16:31.023851 kernel: mousedev: PS/2 mouse device common for all mice
Dec 16 12:16:31.096177 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Dec 16 12:16:31.096336 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 16 12:16:31.096370 kernel: [drm] features: -context_init
Dec 16 12:16:31.172835 kernel: [drm] number of scanouts: 1
Dec 16 12:16:31.174871 kernel: [drm] number of cap sets: 0
Dec 16 12:16:31.174917 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Dec 16 12:16:31.174937 kernel: Console: switching to colour frame buffer device 160x50
Dec 16 12:16:31.175425 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Dec 16 12:16:31.176493 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 16 12:16:31.188317 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 16 12:16:31.188889 systemd[1]: Reloading finished in 311 ms.
Dec 16 12:16:31.200909 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:16:31.201000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.204000 audit: BPF prog-id=43 op=LOAD
Dec 16 12:16:31.204000 audit: BPF prog-id=44 op=LOAD
Dec 16 12:16:31.204000 audit: BPF prog-id=28 op=UNLOAD
Dec 16 12:16:31.204000 audit: BPF prog-id=29 op=UNLOAD
Dec 16 12:16:31.205000 audit: BPF prog-id=45 op=LOAD
Dec 16 12:16:31.205000 audit: BPF prog-id=30 op=UNLOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=46 op=LOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=40 op=UNLOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=47 op=LOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=48 op=LOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=41 op=UNLOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=42 op=UNLOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=49 op=LOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=34 op=UNLOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=50 op=LOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=51 op=LOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=35 op=UNLOAD
Dec 16 12:16:31.206000 audit: BPF prog-id=36 op=UNLOAD
Dec 16 12:16:31.207000 audit: BPF prog-id=52 op=LOAD
Dec 16 12:16:31.207000 audit: BPF prog-id=31 op=UNLOAD
Dec 16 12:16:31.207000 audit: BPF prog-id=53 op=LOAD
Dec 16 12:16:31.207000 audit: BPF prog-id=54 op=LOAD
Dec 16 12:16:31.207000 audit: BPF prog-id=32 op=UNLOAD
Dec 16 12:16:31.207000 audit: BPF prog-id=33 op=UNLOAD
Dec 16 12:16:31.208000 audit: BPF prog-id=55 op=LOAD
Dec 16 12:16:31.208000 audit: BPF prog-id=37 op=UNLOAD
Dec 16 12:16:31.208000 audit: BPF prog-id=56 op=LOAD
Dec 16 12:16:31.208000 audit: BPF prog-id=57 op=LOAD
Dec 16 12:16:31.208000 audit: BPF prog-id=38 op=UNLOAD
Dec 16 12:16:31.208000 audit: BPF prog-id=39 op=UNLOAD
Dec 16 12:16:31.227410 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:16:31.227000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.251890 systemd[1]: Finished ensure-sysext.service.
Dec 16 12:16:31.251000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.269442 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 16 12:16:31.272059 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 16 12:16:31.273068 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:16:31.288963 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:16:31.291052 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:16:31.294974 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:16:31.296731 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:16:31.298981 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Dec 16 12:16:31.300545 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:16:31.300666 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:16:31.301529 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 16 12:16:31.305014 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Dec 16 12:16:31.306531 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:16:31.307546 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 16 12:16:31.308000 audit: BPF prog-id=58 op=LOAD
Dec 16 12:16:31.312246 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 12:16:31.312722 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 16 12:16:31.312766 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Dec 16 12:16:31.314838 systemd[1]: Reached target time-set.target - System Time Set.
Dec 16 12:16:31.316829 kernel: PTP clock support registered
Dec 16 12:16:31.321267 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 16 12:16:31.323987 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:16:31.326612 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:16:31.327317 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:16:31.330000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.330000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.331213 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:16:31.331416 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:16:31.332000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.332000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.332950 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:16:31.333121 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:16:31.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.336299 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:16:31.336474 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:16:31.337000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.337000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.339000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.339000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.338119 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Dec 16 12:16:31.338296 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Dec 16 12:16:31.339693 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Dec 16 12:16:31.342000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.347792 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:16:31.347930 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:16:31.359733 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 16 12:16:31.358000 audit[1589]: SYSTEM_BOOT pid=1589 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.361000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:16:31.361000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1
Dec 16 12:16:31.361000 audit[1609]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe6aca1e0 a2=420 a3=0 items=0 ppid=1565 pid=1609 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:16:31.361000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Dec 16 12:16:31.362751 augenrules[1609]: No rules
Dec 16 12:16:31.363395 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 16 12:16:31.363632 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 16 12:16:31.367347 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 16 12:16:31.403750 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:16:31.410287 systemd-networkd[1583]: lo: Link UP
Dec 16 12:16:31.410296 systemd-networkd[1583]: lo: Gained carrier
Dec 16 12:16:31.411430 systemd-networkd[1583]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:16:31.411440 systemd-networkd[1583]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:16:31.411488 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 12:16:31.412455 systemd-networkd[1583]: eth0: Link UP
Dec 16 12:16:31.412597 systemd-networkd[1583]: eth0: Gained carrier
Dec 16 12:16:31.412615 systemd-networkd[1583]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:16:31.413575 systemd[1]: Reached target network.target - Network.
Dec 16 12:16:31.416223 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 16 12:16:31.418225 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 16 12:16:31.419617 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 16 12:16:31.421360 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 16 12:16:31.436919 systemd-networkd[1583]: eth0: DHCPv4 address 10.0.29.66/25, gateway 10.0.29.1 acquired from 10.0.29.1
Dec 16 12:16:31.442586 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 16 12:16:31.774110 ldconfig[1578]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 16 12:16:31.779926 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 16 12:16:31.782715 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 16 12:16:31.817385 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 16 12:16:31.818703 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 12:16:31.819766 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 16 12:16:31.820879 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 16 12:16:31.822060 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 16 12:16:31.823017 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 16 12:16:31.824086 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update.
Dec 16 12:16:31.825177 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update.
Dec 16 12:16:31.826088 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 16 12:16:31.827092 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 16 12:16:31.827129 systemd[1]: Reached target paths.target - Path Units.
Dec 16 12:16:31.827851 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 12:16:31.830887 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 16 12:16:31.833091 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 16 12:16:31.838273 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 16 12:16:31.839542 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 16 12:16:31.840626 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 16 12:16:31.846144 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 16 12:16:31.847283 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 16 12:16:31.848844 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 16 12:16:31.849775 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 12:16:31.850574 systemd[1]: Reached target basic.target - Basic System.
Dec 16 12:16:31.851429 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 16 12:16:31.851463 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 16 12:16:31.853719 systemd[1]: Starting chronyd.service - NTP client/server...
Dec 16 12:16:31.855421 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 16 12:16:31.857465 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 16 12:16:31.859984 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 16 12:16:31.861624 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 16 12:16:31.864827 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:16:31.864778 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 16 12:16:31.866983 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 16 12:16:31.867795 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 16 12:16:31.879679 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 16 12:16:31.881782 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 16 12:16:31.886941 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 16 12:16:31.887182 jq[1635]: false
Dec 16 12:16:31.890991 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 16 12:16:31.894882 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 16 12:16:31.895737 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 16 12:16:31.896228 extend-filesystems[1637]: Found /dev/vda6
Dec 16 12:16:31.896087 chronyd[1629]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Dec 16 12:16:31.896240 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 16 12:16:31.897082 systemd[1]: Starting update-engine.service - Update Engine...
Dec 16 12:16:31.899266 chronyd[1629]: Loaded seccomp filter (level 2)
Dec 16 12:16:31.899523 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 16 12:16:31.901376 systemd[1]: Started chronyd.service - NTP client/server.
Dec 16 12:16:31.902448 extend-filesystems[1637]: Found /dev/vda9
Dec 16 12:16:31.905655 extend-filesystems[1637]: Checking size of /dev/vda9
Dec 16 12:16:31.905860 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 16 12:16:31.909086 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 16 12:16:31.909356 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 16 12:16:31.911509 jq[1650]: true
Dec 16 12:16:31.919688 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 16 12:16:31.920991 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 16 12:16:31.921790 extend-filesystems[1637]: Resized partition /dev/vda9
Dec 16 12:16:31.932070 extend-filesystems[1676]: resize2fs 1.47.3 (8-Jul-2025)
Dec 16 12:16:31.937045 systemd[1]: motdgen.service: Deactivated successfully.
Dec 16 12:16:31.937294 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 16 12:16:31.943825 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks
Dec 16 12:16:31.945211 update_engine[1649]: I20251216 12:16:31.943959 1649 main.cc:92] Flatcar Update Engine starting
Dec 16 12:16:31.949549 jq[1664]: true
Dec 16 12:16:31.962196 tar[1659]: linux-arm64/LICENSE
Dec 16 12:16:31.962458 tar[1659]: linux-arm64/helm
Dec 16 12:16:31.976879 dbus-daemon[1632]: [system] SELinux support is enabled
Dec 16 12:16:31.979912 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 16 12:16:31.983656 update_engine[1649]: I20251216 12:16:31.983582 1649 update_check_scheduler.cc:74] Next update check in 4m39s
Dec 16 12:16:31.993291 systemd[1]: Started update-engine.service - Update Engine.
Dec 16 12:16:31.995192 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 16 12:16:31.995233 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 16 12:16:31.996649 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 16 12:16:31.997918 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 16 12:16:32.000262 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 16 12:16:32.010144 systemd-logind[1646]: New seat seat0.
Dec 16 12:16:32.013900 systemd-logind[1646]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 16 12:16:32.014035 systemd-logind[1646]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Dec 16 12:16:32.014516 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 16 12:16:32.054246 locksmithd[1697]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 16 12:16:32.079640 bash[1701]: Updated "/home/core/.ssh/authorized_keys"
Dec 16 12:16:32.081792 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 16 12:16:32.091165 systemd[1]: Starting sshkeys.service...
Dec 16 12:16:32.111165 sshd_keygen[1657]: ssh-keygen: generating new host keys: RSA ECDSA ED25519
Dec 16 12:16:32.116055 containerd[1662]: time="2025-12-16T12:16:32Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 16 12:16:32.117613 containerd[1662]: time="2025-12-16T12:16:32.117524240Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5
Dec 16 12:16:32.126364 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 16 12:16:32.133088 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 16 12:16:32.133438 containerd[1662]: time="2025-12-16T12:16:32.133380920Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.28µs"
Dec 16 12:16:32.133438 containerd[1662]: time="2025-12-16T12:16:32.133421200Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 16 12:16:32.133526 containerd[1662]: time="2025-12-16T12:16:32.133469280Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 16 12:16:32.133526 containerd[1662]: time="2025-12-16T12:16:32.133481400Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 16 12:16:32.133636 containerd[1662]: time="2025-12-16T12:16:32.133616640Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 16 12:16:32.133661 containerd[1662]: time="2025-12-16T12:16:32.133638120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 16 12:16:32.133709 containerd[1662]: time="2025-12-16T12:16:32.133692320Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 16 12:16:32.133736 containerd[1662]: time="2025-12-16T12:16:32.133708880Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134041 containerd[1662]: time="2025-12-16T12:16:32.133991760Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134041 containerd[1662]: time="2025-12-16T12:16:32.134026160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134041 containerd[1662]: time="2025-12-16T12:16:32.134040160Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134117 containerd[1662]: time="2025-12-16T12:16:32.134048320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134247 containerd[1662]: time="2025-12-16T12:16:32.134178080Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134247 containerd[1662]: time="2025-12-16T12:16:32.134197160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134299 containerd[1662]: time="2025-12-16T12:16:32.134271760Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134454 containerd[1662]: time="2025-12-16T12:16:32.134433120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134488 containerd[1662]: time="2025-12-16T12:16:32.134471160Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 16 12:16:32.134488 containerd[1662]: time="2025-12-16T12:16:32.134482400Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 16 12:16:32.134532 containerd[1662]: time="2025-12-16T12:16:32.134514720Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 16 12:16:32.135220 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys.
Dec 16 12:16:32.137605 containerd[1662]: time="2025-12-16T12:16:32.137233400Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 16 12:16:32.137605 containerd[1662]: time="2025-12-16T12:16:32.137344400Z" level=info msg="metadata content store policy set" policy=shared
Dec 16 12:16:32.149373 systemd[1]: Starting issuegen.service - Generate /run/issue...
Dec 16 12:16:32.157910 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:16:32.161478 systemd[1]: issuegen.service: Deactivated successfully.
Dec 16 12:16:32.161761 systemd[1]: Finished issuegen.service - Generate /run/issue.
Dec 16 12:16:32.168237 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions...
Dec 16 12:16:32.173802 containerd[1662]: time="2025-12-16T12:16:32.173589000Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:16:32.173802 containerd[1662]: time="2025-12-16T12:16:32.173663440Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:16:32.173802 containerd[1662]: time="2025-12-16T12:16:32.173761640Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:16:32.173802 containerd[1662]: time="2025-12-16T12:16:32.173775960Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:16:32.173802 containerd[1662]: time="2025-12-16T12:16:32.173791120Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:16:32.173802 containerd[1662]: time="2025-12-16T12:16:32.173804120Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173838240Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173850320Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173862880Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173883000Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173895720Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173907040Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173916880Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.173929240Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.174080520Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:16:32.174102 containerd[1662]: time="2025-12-16T12:16:32.174105440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174123640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174136640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174147600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174157400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174170120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174180280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases 
type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174191320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174203720Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174213800Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174239840Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174283680Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174297800Z" level=info msg="Start snapshots syncer" Dec 16 12:16:32.174370 containerd[1662]: time="2025-12-16T12:16:32.174331480Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:16:32.174675 containerd[1662]: time="2025-12-16T12:16:32.174561040Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 12:16:32.174675 containerd[1662]: time="2025-12-16T12:16:32.174610080Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:16:32.175385 containerd[1662]: 
time="2025-12-16T12:16:32.174653960Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174742760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174763520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174782280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174792880Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174805960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174852640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174864760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174875320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174892400Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174922040Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:16:32.175385 containerd[1662]: 
time="2025-12-16T12:16:32.174934240Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:16:32.175385 containerd[1662]: time="2025-12-16T12:16:32.174942480Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.174953520Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.174962160Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.174972360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.174982880Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.175117240Z" level=info msg="runtime interface created" Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.175123040Z" level=info msg="created NRI interface" Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.175134200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.175145200Z" level=info msg="Connect containerd service" Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.175172400Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:16:32.175928 containerd[1662]: time="2025-12-16T12:16:32.175778080Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:16:32.185955 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 12:16:32.190235 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:16:32.194360 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 12:16:32.195571 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:16:32.211854 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Dec 16 12:16:32.233843 extend-filesystems[1676]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 12:16:32.233843 extend-filesystems[1676]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 12:16:32.233843 extend-filesystems[1676]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Dec 16 12:16:32.240053 extend-filesystems[1637]: Resized filesystem in /dev/vda9 Dec 16 12:16:32.235550 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:16:32.235961 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Dec 16 12:16:32.259776 containerd[1662]: time="2025-12-16T12:16:32.259721800Z" level=info msg="Start subscribing containerd event" Dec 16 12:16:32.259937 containerd[1662]: time="2025-12-16T12:16:32.259876880Z" level=info msg="Start recovering state" Dec 16 12:16:32.260139 containerd[1662]: time="2025-12-16T12:16:32.260093600Z" level=info msg="Start event monitor" Dec 16 12:16:32.260199 containerd[1662]: time="2025-12-16T12:16:32.260123240Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:16:32.260250 containerd[1662]: time="2025-12-16T12:16:32.260232320Z" level=info msg="Start streaming server" Dec 16 12:16:32.260347 containerd[1662]: time="2025-12-16T12:16:32.260331760Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:16:32.260479 containerd[1662]: time="2025-12-16T12:16:32.260395160Z" level=info msg="runtime interface starting up..." Dec 16 12:16:32.260479 containerd[1662]: time="2025-12-16T12:16:32.260408160Z" level=info msg="starting plugins..." Dec 16 12:16:32.260479 containerd[1662]: time="2025-12-16T12:16:32.260426720Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:16:32.260479 containerd[1662]: time="2025-12-16T12:16:32.260095040Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:16:32.260560 containerd[1662]: time="2025-12-16T12:16:32.260537720Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 12:16:32.260885 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 12:16:32.262628 containerd[1662]: time="2025-12-16T12:16:32.262304440Z" level=info msg="containerd successfully booted in 0.146774s" Dec 16 12:16:32.374647 tar[1659]: linux-arm64/README.md Dec 16 12:16:32.395566 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Dec 16 12:16:32.693132 systemd-networkd[1583]: eth0: Gained IPv6LL Dec 16 12:16:32.695639 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:16:32.697350 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:16:32.699639 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:16:32.701615 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:16:32.732701 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:16:32.881851 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:33.162862 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:33.487069 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:16:33.490979 (kubelet)[1773]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:16:33.954109 kubelet[1773]: E1216 12:16:33.954034 1773 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:16:33.956354 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:16:33.956491 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:16:33.957035 systemd[1]: kubelet.service: Consumed 694ms CPU time, 247M memory peak. Dec 16 12:16:34.889840 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:35.170836 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:36.901262 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Dec 16 12:16:36.903553 systemd[1]: Started sshd@0-10.0.29.66:22-139.178.68.195:49776.service - OpenSSH per-connection server daemon (139.178.68.195:49776). Dec 16 12:16:38.113842 sshd[1784]: Accepted publickey for core from 139.178.68.195 port 49776 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:38.148078 sshd-session[1784]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:38.154937 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:16:38.155961 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:16:38.160963 systemd-logind[1646]: New session 1 of user core. Dec 16 12:16:38.178881 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:16:38.181511 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:16:38.197690 (systemd)[1794]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:38.201348 systemd-logind[1646]: New session 2 of user core. Dec 16 12:16:38.350886 systemd[1794]: Queued start job for default target default.target. Dec 16 12:16:38.368057 systemd[1794]: Created slice app.slice - User Application Slice. Dec 16 12:16:38.368095 systemd[1794]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:16:38.368107 systemd[1794]: Reached target paths.target - Paths. Dec 16 12:16:38.368161 systemd[1794]: Reached target timers.target - Timers. Dec 16 12:16:38.369419 systemd[1794]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:16:38.370229 systemd[1794]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:16:38.380604 systemd[1794]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:16:38.380773 systemd[1794]: Listening on dbus.socket - D-Bus User Message Bus Socket. 
Dec 16 12:16:38.380910 systemd[1794]: Reached target sockets.target - Sockets. Dec 16 12:16:38.380961 systemd[1794]: Reached target basic.target - Basic System. Dec 16 12:16:38.380997 systemd[1794]: Reached target default.target - Main User Target. Dec 16 12:16:38.381023 systemd[1794]: Startup finished in 174ms. Dec 16 12:16:38.381274 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:16:38.382957 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 12:16:38.895048 systemd[1]: Started sshd@1-10.0.29.66:22-139.178.68.195:49790.service - OpenSSH per-connection server daemon (139.178.68.195:49790). Dec 16 12:16:38.901107 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:38.912064 coreos-metadata[1631]: Dec 16 12:16:38.911 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:16:38.928276 coreos-metadata[1631]: Dec 16 12:16:38.928 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 12:16:39.183839 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:16:39.193434 coreos-metadata[1722]: Dec 16 12:16:39.193 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:16:39.207155 coreos-metadata[1722]: Dec 16 12:16:39.207 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 12:16:39.549804 coreos-metadata[1722]: Dec 16 12:16:39.549 INFO Fetch successful Dec 16 12:16:39.549804 coreos-metadata[1722]: Dec 16 12:16:39.549 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 12:16:39.765900 sshd[1808]: Accepted publickey for core from 139.178.68.195 port 49790 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:39.767324 sshd-session[1808]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:39.772189 systemd-logind[1646]: New session 3 of user core. 
Dec 16 12:16:39.783052 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:16:40.262578 sshd[1816]: Connection closed by 139.178.68.195 port 49790 Dec 16 12:16:40.263192 sshd-session[1808]: pam_unix(sshd:session): session closed for user core Dec 16 12:16:40.267244 systemd[1]: sshd@1-10.0.29.66:22-139.178.68.195:49790.service: Deactivated successfully. Dec 16 12:16:40.269005 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:16:40.271014 systemd-logind[1646]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:16:40.272054 systemd-logind[1646]: Removed session 3. Dec 16 12:16:40.443012 systemd[1]: Started sshd@2-10.0.29.66:22-139.178.68.195:48132.service - OpenSSH per-connection server daemon (139.178.68.195:48132). Dec 16 12:16:41.319901 sshd[1822]: Accepted publickey for core from 139.178.68.195 port 48132 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:41.321228 sshd-session[1822]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:41.325147 systemd-logind[1646]: New session 4 of user core. Dec 16 12:16:41.335040 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:16:41.828286 sshd[1826]: Connection closed by 139.178.68.195 port 48132 Dec 16 12:16:41.828846 sshd-session[1822]: pam_unix(sshd:session): session closed for user core Dec 16 12:16:41.832836 systemd[1]: sshd@2-10.0.29.66:22-139.178.68.195:48132.service: Deactivated successfully. Dec 16 12:16:41.836328 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:16:41.838394 systemd-logind[1646]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:16:41.839567 systemd-logind[1646]: Removed session 4. 
Dec 16 12:16:41.854047 coreos-metadata[1631]: Dec 16 12:16:41.853 INFO Fetch successful Dec 16 12:16:41.854374 coreos-metadata[1631]: Dec 16 12:16:41.854 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:16:42.015996 coreos-metadata[1722]: Dec 16 12:16:42.015 INFO Fetch successful Dec 16 12:16:42.028834 coreos-metadata[1631]: Dec 16 12:16:42.028 INFO Fetch successful Dec 16 12:16:42.028834 coreos-metadata[1631]: Dec 16 12:16:42.028 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 12:16:42.039463 unknown[1722]: wrote ssh authorized keys file for user: core Dec 16 12:16:42.068103 update-ssh-keys[1832]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:16:42.069192 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 12:16:42.071145 systemd[1]: Finished sshkeys.service. Dec 16 12:16:42.162577 coreos-metadata[1631]: Dec 16 12:16:42.162 INFO Fetch successful Dec 16 12:16:42.162577 coreos-metadata[1631]: Dec 16 12:16:42.162 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 12:16:42.298462 coreos-metadata[1631]: Dec 16 12:16:42.298 INFO Fetch successful Dec 16 12:16:42.298462 coreos-metadata[1631]: Dec 16 12:16:42.298 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 12:16:42.430837 coreos-metadata[1631]: Dec 16 12:16:42.430 INFO Fetch successful Dec 16 12:16:42.430837 coreos-metadata[1631]: Dec 16 12:16:42.430 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 12:16:42.562380 coreos-metadata[1631]: Dec 16 12:16:42.562 INFO Fetch successful Dec 16 12:16:42.599171 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:16:42.599622 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. 
Dec 16 12:16:42.599754 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:16:42.602879 systemd[1]: Startup finished in 2.395s (kernel) + 12.417s (initrd) + 13.112s (userspace) = 27.925s. Dec 16 12:16:44.120711 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:16:44.122178 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:16:45.570948 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:16:45.574835 (kubelet)[1848]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:16:45.636235 kubelet[1848]: E1216 12:16:45.636179 1848 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:16:45.639394 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:16:45.639630 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:16:45.640279 systemd[1]: kubelet.service: Consumed 148ms CPU time, 106.1M memory peak. Dec 16 12:16:52.013750 systemd[1]: Started sshd@3-10.0.29.66:22-139.178.68.195:50864.service - OpenSSH per-connection server daemon (139.178.68.195:50864). Dec 16 12:16:52.908931 sshd[1857]: Accepted publickey for core from 139.178.68.195 port 50864 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:52.910316 sshd-session[1857]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:52.914847 systemd-logind[1646]: New session 5 of user core. Dec 16 12:16:52.925136 systemd[1]: Started session-5.scope - Session 5 of User core. 
Dec 16 12:16:53.424875 sshd[1861]: Connection closed by 139.178.68.195 port 50864 Dec 16 12:16:53.424703 sshd-session[1857]: pam_unix(sshd:session): session closed for user core Dec 16 12:16:53.428573 systemd[1]: sshd@3-10.0.29.66:22-139.178.68.195:50864.service: Deactivated successfully. Dec 16 12:16:53.432538 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:16:53.434306 systemd-logind[1646]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:16:53.435249 systemd-logind[1646]: Removed session 5. Dec 16 12:16:53.610527 systemd[1]: Started sshd@4-10.0.29.66:22-139.178.68.195:50866.service - OpenSSH per-connection server daemon (139.178.68.195:50866). Dec 16 12:16:54.515718 sshd[1867]: Accepted publickey for core from 139.178.68.195 port 50866 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:54.517101 sshd-session[1867]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:54.521193 systemd-logind[1646]: New session 6 of user core. Dec 16 12:16:54.529191 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 12:16:55.027782 sshd[1871]: Connection closed by 139.178.68.195 port 50866 Dec 16 12:16:55.028249 sshd-session[1867]: pam_unix(sshd:session): session closed for user core Dec 16 12:16:55.033749 systemd[1]: sshd@4-10.0.29.66:22-139.178.68.195:50866.service: Deactivated successfully. Dec 16 12:16:55.035335 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:16:55.036361 systemd-logind[1646]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:16:55.037269 systemd-logind[1646]: Removed session 6. Dec 16 12:16:55.212147 systemd[1]: Started sshd@5-10.0.29.66:22-139.178.68.195:50878.service - OpenSSH per-connection server daemon (139.178.68.195:50878). Dec 16 12:16:55.694138 chronyd[1629]: Selected source PHC0 Dec 16 12:16:55.871323 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. 
Dec 16 12:16:55.872819 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:16:56.009020 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:16:56.013446 (kubelet)[1888]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:16:56.134664 sshd[1877]: Accepted publickey for core from 139.178.68.195 port 50878 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:56.136413 sshd-session[1877]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:56.141615 systemd-logind[1646]: New session 7 of user core. Dec 16 12:16:56.155065 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 12:16:56.692529 sshd[1896]: Connection closed by 139.178.68.195 port 50878 Dec 16 12:16:56.693037 sshd-session[1877]: pam_unix(sshd:session): session closed for user core Dec 16 12:16:56.697430 systemd[1]: sshd@5-10.0.29.66:22-139.178.68.195:50878.service: Deactivated successfully. Dec 16 12:16:56.699234 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:16:56.700101 systemd-logind[1646]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:16:56.702358 systemd-logind[1646]: Removed session 7. Dec 16 12:16:56.862126 kubelet[1888]: E1216 12:16:56.862048 1888 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:16:56.864417 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:16:56.864555 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:16:56.866976 systemd[1]: kubelet.service: Consumed 148ms CPU time, 107.6M memory peak. 
Dec 16 12:16:56.896594 systemd[1]: Started sshd@6-10.0.29.66:22-139.178.68.195:50890.service - OpenSSH per-connection server daemon (139.178.68.195:50890). Dec 16 12:16:57.849071 sshd[1903]: Accepted publickey for core from 139.178.68.195 port 50890 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:57.850559 sshd-session[1903]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:57.855918 systemd-logind[1646]: New session 8 of user core. Dec 16 12:16:57.866072 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 12:16:58.223730 sudo[1908]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:16:58.224056 sudo[1908]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:16:58.234975 sudo[1908]: pam_unix(sudo:session): session closed for user root Dec 16 12:16:58.403573 sshd[1907]: Connection closed by 139.178.68.195 port 50890 Dec 16 12:16:58.403424 sshd-session[1903]: pam_unix(sshd:session): session closed for user core Dec 16 12:16:58.407728 systemd[1]: sshd@6-10.0.29.66:22-139.178.68.195:50890.service: Deactivated successfully. Dec 16 12:16:58.409336 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 12:16:58.411758 systemd-logind[1646]: Session 8 logged out. Waiting for processes to exit. Dec 16 12:16:58.412839 systemd-logind[1646]: Removed session 8. Dec 16 12:16:58.586332 systemd[1]: Started sshd@7-10.0.29.66:22-139.178.68.195:50906.service - OpenSSH per-connection server daemon (139.178.68.195:50906). Dec 16 12:16:59.487699 sshd[1915]: Accepted publickey for core from 139.178.68.195 port 50906 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:16:59.489144 sshd-session[1915]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:16:59.492838 systemd-logind[1646]: New session 9 of user core. 
Dec 16 12:16:59.501182 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 12:16:59.833392 sudo[1921]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:16:59.833667 sudo[1921]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:16:59.836577 sudo[1921]: pam_unix(sudo:session): session closed for user root Dec 16 12:16:59.842140 sudo[1920]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:16:59.842385 sudo[1920]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:16:59.848904 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:16:59.885000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:16:59.887086 augenrules[1945]: No rules Dec 16 12:16:59.887494 kernel: kauditd_printk_skb: 187 callbacks suppressed Dec 16 12:16:59.887539 kernel: audit: type=1305 audit(1765887419.885:231): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:16:59.889514 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:16:59.889740 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
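The audit records above are flat `key=value` sequences; the repeated `auid=4294967295 ses=4294967295` means the event is not tied to any login session (4294967295 is `(uint32)-1`, the kernel's "unset" marker). A small illustrative parser — an assumption for this sketch, not an auditd tool:

```python
import shlex

UNSET_ID = 2**32 - 1  # 4294967295: kernel convention for "no audit uid / no session"

def parse_audit(record: str) -> dict:
    # Split an audit record body into key=value pairs; shlex keeps
    # quoted values (e.g. msg='...') together as one token.
    fields = {}
    for tok in shlex.split(record):
        if "=" in tok:
            k, _, v = tok.partition("=")
            fields[k] = v
    return fields

# Sample body copied from the CONFIG_CHANGE record above.
rec = ("audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 "
       "subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1")
f = parse_audit(rec)
assert int(f["auid"]) == UNSET_ID  # event raised by the daemon, not a logged-in user
print(f["op"], f["list"], f["res"])
```

Here `op=remove_rule` with `res=1` (success) matches the augenrules "No rules" outcome: the reload removed the existing rules and loaded nothing.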
Dec 16 12:16:59.885000 audit[1945]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc1c33d50 a2=420 a3=0 items=0 ppid=1926 pid=1945 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:59.893089 kernel: audit: type=1300 audit(1765887419.885:231): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc1c33d50 a2=420 a3=0 items=0 ppid=1926 pid=1945 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:59.893226 kernel: audit: type=1327 audit(1765887419.885:231): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:16:59.885000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:16:59.893521 sudo[1920]: pam_unix(sudo:session): session closed for user root Dec 16 12:16:59.889000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:59.896659 kernel: audit: type=1130 audit(1765887419.889:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:59.896751 kernel: audit: type=1131 audit(1765887419.889:233): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:16:59.889000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:16:59.892000 audit[1920]: USER_END pid=1920 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:16:59.901251 kernel: audit: type=1106 audit(1765887419.892:234): pid=1920 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:16:59.901321 kernel: audit: type=1104 audit(1765887419.892:235): pid=1920 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:16:59.892000 audit[1920]: CRED_DISP pid=1920 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:00.063923 sshd[1919]: Connection closed by 139.178.68.195 port 50906 Dec 16 12:17:00.064001 sshd-session[1915]: pam_unix(sshd:session): session closed for user core Dec 16 12:17:00.064000 audit[1915]: USER_END pid=1915 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:00.064000 audit[1915]: CRED_DISP pid=1915 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:00.070944 systemd[1]: sshd@7-10.0.29.66:22-139.178.68.195:50906.service: Deactivated successfully. Dec 16 12:17:00.072502 systemd[1]: session-9.scope: Deactivated successfully. 
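Several audit records above carry a `PROCTITLE` field, which is the process's argv hex-encoded with NUL bytes separating the arguments. A short sketch decoding one of the values from the auditctl record above:

```python
def decode_proctitle(hexstr: str) -> list:
    # PROCTITLE is hex-encoded argv; arguments are separated by NUL bytes.
    raw = bytes.fromhex(hexstr)
    return [part.decode() for part in raw.split(b"\x00") if part]

# Value copied from the "audit: PROCTITLE" record above.
argv = decode_proctitle(
    "2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573"
)
print(argv)  # → ['/sbin/auditctl', '-R', '/etc/audit/audit.rules']
```

Decoding recovers the exact command line that triggered the syscall record, which is how the `auditctl -R` reload can be tied back to the `systemctl restart audit-rules` sudo invocation.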
Dec 16 12:17:00.073727 kernel: audit: type=1106 audit(1765887420.064:236): pid=1915 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:00.073773 kernel: audit: type=1104 audit(1765887420.064:237): pid=1915 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:00.073801 kernel: audit: type=1131 audit(1765887420.069:238): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.29.66:22-139.178.68.195:50906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:00.069000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.29.66:22-139.178.68.195:50906 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:00.074320 systemd-logind[1646]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:17:00.075103 systemd-logind[1646]: Removed session 9. Dec 16 12:17:00.249000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.29.66:22-139.178.68.195:50918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:00.250353 systemd[1]: Started sshd@8-10.0.29.66:22-139.178.68.195:50918.service - OpenSSH per-connection server daemon (139.178.68.195:50918). 
Dec 16 12:17:01.125000 audit[1954]: USER_ACCT pid=1954 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:01.126705 sshd[1954]: Accepted publickey for core from 139.178.68.195 port 50918 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:17:01.126000 audit[1954]: CRED_ACQ pid=1954 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:01.126000 audit[1954]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff3aa9480 a2=3 a3=0 items=0 ppid=1 pid=1954 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:01.126000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:01.128002 sshd-session[1954]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:17:01.132567 systemd-logind[1646]: New session 10 of user core. Dec 16 12:17:01.139244 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 16 12:17:01.140000 audit[1954]: USER_START pid=1954 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:01.142000 audit[1958]: CRED_ACQ pid=1958 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:01.467000 audit[1959]: USER_ACCT pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:17:01.467000 audit[1959]: CRED_REFR pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:17:01.468288 sudo[1959]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:17:01.467000 audit[1959]: USER_START pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:17:01.468545 sudo[1959]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:17:01.791152 systemd[1]: Starting docker.service - Docker Application Container Engine... 
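The NETFILTER_CFG records that follow show dockerd registering its iptables/ip6tables chains at startup; `family=2` is IPv4 and `family=10` is IPv6. A sketch tallying chain registrations per family over a few sample lines copied from the log (the sample list is an illustration, not the full record set):

```python
import re

# Matches the NETFILTER_CFG audit records emitted while dockerd sets up chains.
CFG_RE = re.compile(r"NETFILTER_CFG table=(\w+):\d+ family=(\d+) entries=\d+ op=(\w+)")

samples = [
    "audit[2032]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain",
    "audit[2034]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain",
    "audit[2087]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain",
]

counts = {}
for line in samples:
    m = CFG_RE.search(line)
    if m and m.group(3) == "nft_register_chain":
        fam = {"2": "ipv4", "10": "ipv6"}.get(m.group(2), m.group(2))
        counts[fam] = counts.get(fam, 0) + 1
print(counts)  # → {'ipv4': 2, 'ipv6': 1}
```

Note the records come from `comm="iptables"`/`comm="ip6tables"` but `exe="/usr/bin/xtables-nft-multi"`: the legacy command names are dispatched through the nft-backed multi-binary, which is why the audit op is `nft_register_chain`.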
Dec 16 12:17:01.804334 (dockerd)[1981]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:17:02.044238 dockerd[1981]: time="2025-12-16T12:17:02.043427955Z" level=info msg="Starting up" Dec 16 12:17:02.044697 dockerd[1981]: time="2025-12-16T12:17:02.044671959Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:17:02.055439 dockerd[1981]: time="2025-12-16T12:17:02.055360913Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:17:02.083637 systemd[1]: var-lib-docker-metacopy\x2dcheck1282501534-merged.mount: Deactivated successfully. Dec 16 12:17:02.094495 dockerd[1981]: time="2025-12-16T12:17:02.094459078Z" level=info msg="Loading containers: start." Dec 16 12:17:02.104853 kernel: Initializing XFRM netlink socket Dec 16 12:17:02.149000 audit[2032]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.149000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffdae00820 a2=0 a3=0 items=0 ppid=1981 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.149000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:17:02.151000 audit[2034]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.151000 audit[2034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc84ad2a0 a2=0 a3=0 items=0 ppid=1981 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.151000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:17:02.153000 audit[2036]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.153000 audit[2036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcd8b75d0 a2=0 a3=0 items=0 ppid=1981 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.153000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:17:02.154000 audit[2038]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.154000 audit[2038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff57e14f0 a2=0 a3=0 items=0 ppid=1981 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.154000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:17:02.156000 audit[2040]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.156000 audit[2040]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc59b7c40 a2=0 a3=0 items=0 ppid=1981 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.156000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:17:02.158000 audit[2042]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2042 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.158000 audit[2042]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffed9def20 a2=0 a3=0 items=0 ppid=1981 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.158000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:17:02.160000 audit[2044]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.160000 audit[2044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdba2d230 a2=0 a3=0 items=0 ppid=1981 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.160000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:17:02.162000 audit[2046]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.162000 audit[2046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe25d1820 a2=0 a3=0 items=0 ppid=1981 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.162000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:17:02.192000 audit[2049]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.192000 audit[2049]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=fffffbaa1240 a2=0 a3=0 items=0 ppid=1981 pid=2049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.192000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:17:02.193000 audit[2051]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.193000 audit[2051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffcafcd4a0 a2=0 a3=0 items=0 ppid=1981 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.193000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:17:02.195000 audit[2053]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.195000 audit[2053]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=236 a0=3 a1=ffffefb2a320 a2=0 a3=0 items=0 ppid=1981 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.195000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:17:02.197000 audit[2055]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.197000 audit[2055]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffd5fe4e30 a2=0 a3=0 items=0 ppid=1981 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.197000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:17:02.199000 audit[2057]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.199000 audit[2057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffffb94f960 a2=0 a3=0 items=0 ppid=1981 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.199000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:17:02.233000 audit[2087]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.233000 
audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffed69d610 a2=0 a3=0 items=0 ppid=1981 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.233000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:17:02.235000 audit[2089]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.235000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc1d8dfa0 a2=0 a3=0 items=0 ppid=1981 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.235000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:17:02.237000 audit[2091]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.237000 audit[2091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd2fb4510 a2=0 a3=0 items=0 ppid=1981 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.237000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:17:02.239000 audit[2093]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2093 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.239000 audit[2093]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeef1b000 a2=0 a3=0 items=0 ppid=1981 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.239000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:17:02.240000 audit[2095]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2095 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.240000 audit[2095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc37b55f0 a2=0 a3=0 items=0 ppid=1981 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.240000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:17:02.242000 audit[2097]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.242000 audit[2097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc05f5880 a2=0 a3=0 items=0 ppid=1981 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.242000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:17:02.244000 audit[2099]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.244000 
audit[2099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff44be940 a2=0 a3=0 items=0 ppid=1981 pid=2099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.244000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:17:02.246000 audit[2101]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2101 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.246000 audit[2101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffeb95abd0 a2=0 a3=0 items=0 ppid=1981 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.246000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:17:02.248000 audit[2103]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.248000 audit[2103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffe90d1ed0 a2=0 a3=0 items=0 ppid=1981 pid=2103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.248000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 
12:17:02.249000 audit[2105]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2105 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.249000 audit[2105]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffcf964b00 a2=0 a3=0 items=0 ppid=1981 pid=2105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.249000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:17:02.251000 audit[2107]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.251000 audit[2107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=fffff0697eb0 a2=0 a3=0 items=0 ppid=1981 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.251000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:17:02.253000 audit[2109]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.253000 audit[2109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffdd05a400 a2=0 a3=0 items=0 ppid=1981 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.253000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:17:02.254000 audit[2111]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.254000 audit[2111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffd1d7e3b0 a2=0 a3=0 items=0 ppid=1981 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.254000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:17:02.259000 audit[2116]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.259000 audit[2116]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc4de14d0 a2=0 a3=0 items=0 ppid=1981 pid=2116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.259000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:17:02.262000 audit[2118]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2118 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.262000 audit[2118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffffc4bedf0 a2=0 a3=0 items=0 ppid=1981 pid=2118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:17:02.262000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:17:02.264000 audit[2120]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.264000 audit[2120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffea7663e0 a2=0 a3=0 items=0 ppid=1981 pid=2120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.264000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:17:02.265000 audit[2122]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.265000 audit[2122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff0cbe110 a2=0 a3=0 items=0 ppid=1981 pid=2122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.265000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:17:02.267000 audit[2124]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.267000 audit[2124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffff3df82d0 a2=0 a3=0 items=0 ppid=1981 pid=2124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.267000 
audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:17:02.269000 audit[2126]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:02.269000 audit[2126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffe5c6c7d0 a2=0 a3=0 items=0 ppid=1981 pid=2126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.269000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:17:02.285000 audit[2132]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.285000 audit[2132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffdfc53340 a2=0 a3=0 items=0 ppid=1981 pid=2132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.285000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:17:02.287000 audit[2134]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.287000 audit[2134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffc914b3d0 a2=0 a3=0 items=0 ppid=1981 pid=2134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.287000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:17:02.295000 audit[2142]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.295000 audit[2142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffc88be640 a2=0 a3=0 items=0 ppid=1981 pid=2142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.295000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:17:02.304000 audit[2148]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.304000 audit[2148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffe2897af0 a2=0 a3=0 items=0 ppid=1981 pid=2148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.304000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:17:02.306000 audit[2150]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.306000 audit[2150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffc83ea4c0 a2=0 a3=0 items=0 ppid=1981 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.306000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:17:02.307000 audit[2152]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.307000 audit[2152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff2875950 a2=0 a3=0 items=0 ppid=1981 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.307000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:17:02.309000 audit[2154]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.309000 audit[2154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffff96af4d0 a2=0 a3=0 items=0 ppid=1981 pid=2154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.309000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:17:02.310000 audit[2156]: NETFILTER_CFG table=filter:41 family=2 entries=1 
op=nft_register_rule pid=2156 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:02.310000 audit[2156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe6802e00 a2=0 a3=0 items=0 ppid=1981 pid=2156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:02.310000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:17:02.312250 systemd-networkd[1583]: docker0: Link UP Dec 16 12:17:02.316368 dockerd[1981]: time="2025-12-16T12:17:02.316289629Z" level=info msg="Loading containers: done." Dec 16 12:17:02.327779 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2276721477-merged.mount: Deactivated successfully. Dec 16 12:17:02.334533 dockerd[1981]: time="2025-12-16T12:17:02.334479807Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:17:02.334674 dockerd[1981]: time="2025-12-16T12:17:02.334568888Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:17:02.334752 dockerd[1981]: time="2025-12-16T12:17:02.334733608Z" level=info msg="Initializing buildkit" Dec 16 12:17:02.355973 dockerd[1981]: time="2025-12-16T12:17:02.355940396Z" level=info msg="Completed buildkit initialization" Dec 16 12:17:02.360860 dockerd[1981]: time="2025-12-16T12:17:02.360395530Z" level=info msg="Daemon has completed initialization" Dec 16 12:17:02.360860 dockerd[1981]: time="2025-12-16T12:17:02.360465250Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:17:02.361034 systemd[1]: Started docker.service - Docker 
Application Container Engine. Dec 16 12:17:02.360000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:03.096971 containerd[1662]: time="2025-12-16T12:17:03.096919609Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 12:17:03.757557 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3887544975.mount: Deactivated successfully. Dec 16 12:17:04.470660 containerd[1662]: time="2025-12-16T12:17:04.470604004Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:04.471549 containerd[1662]: time="2025-12-16T12:17:04.471507086Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=22974850" Dec 16 12:17:04.472367 containerd[1662]: time="2025-12-16T12:17:04.472334169Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:04.476308 containerd[1662]: time="2025-12-16T12:17:04.476227218Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:04.477321 containerd[1662]: time="2025-12-16T12:17:04.477199821Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.380235612s" Dec 16 12:17:04.477321 containerd[1662]: 
time="2025-12-16T12:17:04.477237421Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 16 12:17:04.477951 containerd[1662]: time="2025-12-16T12:17:04.477923863Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 12:17:05.769696 containerd[1662]: time="2025-12-16T12:17:05.769646156Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:05.771659 containerd[1662]: time="2025-12-16T12:17:05.771628082Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19127323" Dec 16 12:17:05.772892 containerd[1662]: time="2025-12-16T12:17:05.772845286Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:05.778869 containerd[1662]: time="2025-12-16T12:17:05.776552978Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:05.779655 containerd[1662]: time="2025-12-16T12:17:05.779603108Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.301642325s" Dec 16 12:17:05.779655 containerd[1662]: time="2025-12-16T12:17:05.779647788Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image 
reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 16 12:17:05.780205 containerd[1662]: time="2025-12-16T12:17:05.780166990Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 12:17:06.870617 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 12:17:06.872026 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:17:07.009752 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:07.009000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:07.012983 kernel: kauditd_printk_skb: 132 callbacks suppressed Dec 16 12:17:07.013063 kernel: audit: type=1130 audit(1765887427.009:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:07.013723 (kubelet)[2266]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:17:07.218118 kubelet[2266]: E1216 12:17:07.217992 2266 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:17:07.220779 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:17:07.220935 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Dec 16 12:17:07.221000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:17:07.221953 systemd[1]: kubelet.service: Consumed 140ms CPU time, 109.6M memory peak. Dec 16 12:17:07.225853 kernel: audit: type=1131 audit(1765887427.221:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:17:07.620436 containerd[1662]: time="2025-12-16T12:17:07.620391284Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:07.621827 containerd[1662]: time="2025-12-16T12:17:07.621744488Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14183580" Dec 16 12:17:07.623200 containerd[1662]: time="2025-12-16T12:17:07.623166893Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:07.625887 containerd[1662]: time="2025-12-16T12:17:07.625853781Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:07.626896 containerd[1662]: time="2025-12-16T12:17:07.626864265Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.846383994s" Dec 16 
12:17:07.626994 containerd[1662]: time="2025-12-16T12:17:07.626978985Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 16 12:17:07.627572 containerd[1662]: time="2025-12-16T12:17:07.627547227Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 12:17:08.526917 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2163607883.mount: Deactivated successfully. Dec 16 12:17:08.694624 containerd[1662]: time="2025-12-16T12:17:08.694544564Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:08.696042 containerd[1662]: time="2025-12-16T12:17:08.695988649Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Dec 16 12:17:08.697839 containerd[1662]: time="2025-12-16T12:17:08.697801815Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:08.699841 containerd[1662]: time="2025-12-16T12:17:08.699788781Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:08.700524 containerd[1662]: time="2025-12-16T12:17:08.700478623Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.072818356s" Dec 16 12:17:08.700524 containerd[1662]: time="2025-12-16T12:17:08.700516983Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 16 12:17:08.701287 containerd[1662]: time="2025-12-16T12:17:08.701263186Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 12:17:09.512611 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount619891175.mount: Deactivated successfully. Dec 16 12:17:09.973500 containerd[1662]: time="2025-12-16T12:17:09.973414140Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:09.974408 containerd[1662]: time="2025-12-16T12:17:09.974337703Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19575910" Dec 16 12:17:09.975551 containerd[1662]: time="2025-12-16T12:17:09.975502347Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:09.978139 containerd[1662]: time="2025-12-16T12:17:09.978095875Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:09.978949 containerd[1662]: time="2025-12-16T12:17:09.978913318Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.277615612s" Dec 16 12:17:09.978994 containerd[1662]: time="2025-12-16T12:17:09.978949558Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference 
\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 16 12:17:09.979630 containerd[1662]: time="2025-12-16T12:17:09.979464240Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 12:17:10.502336 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1769057396.mount: Deactivated successfully. Dec 16 12:17:10.507824 containerd[1662]: time="2025-12-16T12:17:10.507760732Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:10.509081 containerd[1662]: time="2025-12-16T12:17:10.509032136Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Dec 16 12:17:10.510100 containerd[1662]: time="2025-12-16T12:17:10.510054659Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:10.512194 containerd[1662]: time="2025-12-16T12:17:10.512153266Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:10.512742 containerd[1662]: time="2025-12-16T12:17:10.512717068Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 533.225508ms" Dec 16 12:17:10.512794 containerd[1662]: time="2025-12-16T12:17:10.512747188Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 16 12:17:10.513194 containerd[1662]: time="2025-12-16T12:17:10.513171509Z" 
level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 12:17:11.154916 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2053954614.mount: Deactivated successfully. Dec 16 12:17:14.450835 containerd[1662]: time="2025-12-16T12:17:14.449865360Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:14.451217 containerd[1662]: time="2025-12-16T12:17:14.451177444Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=96314798" Dec 16 12:17:14.451862 containerd[1662]: time="2025-12-16T12:17:14.451833126Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:14.456445 containerd[1662]: time="2025-12-16T12:17:14.456408621Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:14.457730 containerd[1662]: time="2025-12-16T12:17:14.457700105Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 3.944498356s" Dec 16 12:17:14.457730 containerd[1662]: time="2025-12-16T12:17:14.457733385Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 16 12:17:17.370600 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 16 12:17:17.372899 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 12:17:17.671730 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:17.671000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:17.675833 kernel: audit: type=1130 audit(1765887437.671:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:17.689443 (kubelet)[2428]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:17:17.692900 update_engine[1649]: I20251216 12:17:17.692852 1649 update_attempter.cc:509] Updating boot flags... Dec 16 12:17:17.723453 kubelet[2428]: E1216 12:17:17.723404 2428 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:17:17.726203 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:17:17.726451 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:17:17.728966 systemd[1]: kubelet.service: Consumed 139ms CPU time, 108.3M memory peak. Dec 16 12:17:17.728000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Dec 16 12:17:17.732883 kernel: audit: type=1131 audit(1765887437.728:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:17:19.357324 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:19.357488 systemd[1]: kubelet.service: Consumed 139ms CPU time, 108.3M memory peak. Dec 16 12:17:19.356000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:19.359443 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:17:19.356000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:19.362238 kernel: audit: type=1130 audit(1765887439.356:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:19.362336 kernel: audit: type=1131 audit(1765887439.356:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:19.386126 systemd[1]: Reload requested from client PID 2461 ('systemctl') (unit session-10.scope)... Dec 16 12:17:19.386144 systemd[1]: Reloading... Dec 16 12:17:19.460836 zram_generator::config[2504]: No configuration found. Dec 16 12:17:19.629283 systemd[1]: Reloading finished in 242 ms. 
Dec 16 12:17:19.653000 audit: BPF prog-id=63 op=LOAD Dec 16 12:17:19.653000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:17:19.656331 kernel: audit: type=1334 audit(1765887439.653:295): prog-id=63 op=LOAD Dec 16 12:17:19.656399 kernel: audit: type=1334 audit(1765887439.653:296): prog-id=58 op=UNLOAD Dec 16 12:17:19.656420 kernel: audit: type=1334 audit(1765887439.654:297): prog-id=64 op=LOAD Dec 16 12:17:19.656436 kernel: audit: type=1334 audit(1765887439.654:298): prog-id=46 op=UNLOAD Dec 16 12:17:19.656460 kernel: audit: type=1334 audit(1765887439.655:299): prog-id=65 op=LOAD Dec 16 12:17:19.656476 kernel: audit: type=1334 audit(1765887439.655:300): prog-id=66 op=LOAD Dec 16 12:17:19.654000 audit: BPF prog-id=64 op=LOAD Dec 16 12:17:19.654000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:17:19.655000 audit: BPF prog-id=65 op=LOAD Dec 16 12:17:19.655000 audit: BPF prog-id=66 op=LOAD Dec 16 12:17:19.655000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:17:19.655000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:17:19.656000 audit: BPF prog-id=67 op=LOAD Dec 16 12:17:19.657000 audit: BPF prog-id=68 op=LOAD Dec 16 12:17:19.657000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:17:19.657000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:17:19.658000 audit: BPF prog-id=69 op=LOAD Dec 16 12:17:19.658000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:17:19.659000 audit: BPF prog-id=70 op=LOAD Dec 16 12:17:19.659000 audit: BPF prog-id=71 op=LOAD Dec 16 12:17:19.659000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:17:19.659000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:17:19.660000 audit: BPF prog-id=72 op=LOAD Dec 16 12:17:19.671000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:17:19.671000 audit: BPF prog-id=73 op=LOAD Dec 16 12:17:19.671000 audit: BPF prog-id=74 op=LOAD Dec 16 12:17:19.671000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:17:19.671000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:17:19.672000 audit: BPF prog-id=75 op=LOAD Dec 16 12:17:19.672000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:17:19.673000 
audit: BPF prog-id=76 op=LOAD Dec 16 12:17:19.673000 audit: BPF prog-id=77 op=LOAD Dec 16 12:17:19.673000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:17:19.673000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:17:19.673000 audit: BPF prog-id=78 op=LOAD Dec 16 12:17:19.673000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:17:19.674000 audit: BPF prog-id=79 op=LOAD Dec 16 12:17:19.674000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:17:19.675000 audit: BPF prog-id=80 op=LOAD Dec 16 12:17:19.675000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:17:19.675000 audit: BPF prog-id=81 op=LOAD Dec 16 12:17:19.675000 audit: BPF prog-id=82 op=LOAD Dec 16 12:17:19.675000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:17:19.675000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:17:19.695794 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:17:19.695884 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:17:19.695000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:17:19.696334 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:19.696385 systemd[1]: kubelet.service: Consumed 94ms CPU time, 95.4M memory peak. Dec 16 12:17:19.698881 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:17:20.458000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:20.459366 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:17:20.463654 (kubelet)[2555]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:17:20.494731 kubelet[2555]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:17:20.495056 kubelet[2555]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:17:20.616159 kubelet[2555]: I1216 12:17:20.616091 2555 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:17:23.509105 kubelet[2555]: I1216 12:17:23.509049 2555 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:17:23.509105 kubelet[2555]: I1216 12:17:23.509083 2555 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:17:23.511512 kubelet[2555]: I1216 12:17:23.511457 2555 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:17:23.511512 kubelet[2555]: I1216 12:17:23.511484 2555 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 16 12:17:23.511739 kubelet[2555]: I1216 12:17:23.511703 2555 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:17:23.518171 kubelet[2555]: E1216 12:17:23.518133 2555 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.29.66:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 12:17:23.518932 kubelet[2555]: I1216 12:17:23.518902 2555 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:17:23.522347 kubelet[2555]: I1216 12:17:23.522304 2555 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:17:23.524894 kubelet[2555]: I1216 12:17:23.524839 2555 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:17:23.525173 kubelet[2555]: I1216 12:17:23.525105 2555 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:17:23.525264 kubelet[2555]: I1216 12:17:23.525130 2555 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-5-b12717c6ea","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:17:23.525264 kubelet[2555]: I1216 12:17:23.525262 2555 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 
12:17:23.525481 kubelet[2555]: I1216 12:17:23.525271 2555 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:17:23.525481 kubelet[2555]: I1216 12:17:23.525356 2555 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:17:23.528162 kubelet[2555]: I1216 12:17:23.528136 2555 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:17:23.530241 kubelet[2555]: I1216 12:17:23.530186 2555 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:17:23.530241 kubelet[2555]: I1216 12:17:23.530212 2555 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:17:23.530241 kubelet[2555]: I1216 12:17:23.530237 2555 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:17:23.530241 kubelet[2555]: I1216 12:17:23.530250 2555 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:17:23.531133 kubelet[2555]: E1216 12:17:23.530797 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.29.66:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-5-b12717c6ea&limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:17:23.531133 kubelet[2555]: E1216 12:17:23.531094 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.29.66:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:17:23.531451 kubelet[2555]: I1216 12:17:23.531418 2555 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:17:23.534359 kubelet[2555]: I1216 12:17:23.532376 2555 kubelet.go:940] "Not starting 
ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:17:23.534359 kubelet[2555]: I1216 12:17:23.532417 2555 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 12:17:23.534359 kubelet[2555]: W1216 12:17:23.532462 2555 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 12:17:23.538068 kubelet[2555]: I1216 12:17:23.538045 2555 server.go:1262] "Started kubelet" Dec 16 12:17:23.538228 kubelet[2555]: I1216 12:17:23.538201 2555 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:17:23.538716 kubelet[2555]: I1216 12:17:23.538671 2555 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:17:23.538876 kubelet[2555]: I1216 12:17:23.538860 2555 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:17:23.539030 kubelet[2555]: I1216 12:17:23.538999 2555 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:17:23.539227 kubelet[2555]: I1216 12:17:23.539209 2555 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:17:23.539319 kubelet[2555]: I1216 12:17:23.538885 2555 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:17:23.539434 kubelet[2555]: I1216 12:17:23.538700 2555 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:17:23.542295 kubelet[2555]: I1216 12:17:23.542273 2555 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:17:23.542541 kubelet[2555]: I1216 12:17:23.542367 2555 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 
12:17:23.542541 kubelet[2555]: I1216 12:17:23.542431 2555 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:17:23.542780 kubelet[2555]: E1216 12:17:23.542745 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.29.66:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:17:23.543046 kubelet[2555]: I1216 12:17:23.542990 2555 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:17:23.543330 kubelet[2555]: I1216 12:17:23.543092 2555 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:17:23.543330 kubelet[2555]: E1216 12:17:23.543302 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:23.543417 kubelet[2555]: E1216 12:17:23.543379 2555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.29.66:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-b12717c6ea?timeout=10s\": dial tcp 10.0.29.66:6443: connect: connection refused" interval="200ms" Dec 16 12:17:23.543925 kubelet[2555]: E1216 12:17:23.543898 2555 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:17:23.545083 kubelet[2555]: I1216 12:17:23.545053 2555 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:17:23.548380 kernel: kauditd_printk_skb: 36 callbacks suppressed Dec 16 12:17:23.548464 kernel: audit: type=1325 audit(1765887443.545:337): table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2573 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.545000 audit[2573]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2573 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.545000 audit[2573]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc68c5240 a2=0 a3=0 items=0 ppid=2555 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.552124 kubelet[2555]: E1216 12:17:23.547435 2555 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.29.66:6443/api/v1/namespaces/default/events\": dial tcp 10.0.29.66:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-5-b12717c6ea.1881b14619676c85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-5-b12717c6ea,UID:ci-4547-0-0-5-b12717c6ea,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-5-b12717c6ea,},FirstTimestamp:2025-12-16 12:17:23.537980549 +0000 UTC m=+3.071481319,LastTimestamp:2025-12-16 12:17:23.537980549 +0000 UTC m=+3.071481319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-5-b12717c6ea,}" Dec 16 12:17:23.545000 audit: 
PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:17:23.553764 kernel: audit: type=1300 audit(1765887443.545:337): arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc68c5240 a2=0 a3=0 items=0 ppid=2555 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.553841 kernel: audit: type=1327 audit(1765887443.545:337): proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:17:23.548000 audit[2574]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2574 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.555572 kernel: audit: type=1325 audit(1765887443.548:338): table=filter:43 family=2 entries=1 op=nft_register_chain pid=2574 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.548000 audit[2574]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffce9a000 a2=0 a3=0 items=0 ppid=2555 pid=2574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.559139 kernel: audit: type=1300 audit(1765887443.548:338): arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffce9a000 a2=0 a3=0 items=0 ppid=2555 pid=2574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.548000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:17:23.562825 kernel: audit: type=1327 audit(1765887443.548:338): 
proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:17:23.562882 kernel: audit: type=1325 audit(1765887443.550:339): table=filter:44 family=2 entries=2 op=nft_register_chain pid=2576 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.550000 audit[2576]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2576 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.562956 kubelet[2555]: I1216 12:17:23.560653 2555 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:17:23.562956 kubelet[2555]: I1216 12:17:23.560668 2555 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:17:23.562956 kubelet[2555]: I1216 12:17:23.560686 2555 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:17:23.550000 audit[2576]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffa453e70 a2=0 a3=0 items=0 ppid=2555 pid=2576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.564239 kubelet[2555]: I1216 12:17:23.564208 2555 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 12:17:23.565794 kubelet[2555]: I1216 12:17:23.565487 2555 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 16 12:17:23.565794 kubelet[2555]: I1216 12:17:23.565503 2555 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:17:23.565794 kubelet[2555]: I1216 12:17:23.565542 2555 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:17:23.565794 kubelet[2555]: E1216 12:17:23.565580 2555 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:17:23.566523 kernel: audit: type=1300 audit(1765887443.550:339): arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffa453e70 a2=0 a3=0 items=0 ppid=2555 pid=2576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.566718 kubelet[2555]: E1216 12:17:23.566696 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.29.66:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:17:23.550000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:17:23.567771 kubelet[2555]: I1216 12:17:23.567757 2555 policy_none.go:49] "None policy: Start" Dec 16 12:17:23.567873 kubelet[2555]: I1216 12:17:23.567862 2555 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:17:23.567962 kubelet[2555]: I1216 12:17:23.567953 2555 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:17:23.568485 kernel: audit: type=1327 audit(1765887443.550:339): proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 
16 12:17:23.568601 kernel: audit: type=1325 audit(1765887443.552:340): table=filter:45 family=2 entries=2 op=nft_register_chain pid=2578 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.552000 audit[2578]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2578 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.569672 kubelet[2555]: I1216 12:17:23.569628 2555 policy_none.go:47] "Start" Dec 16 12:17:23.552000 audit[2578]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd8569330 a2=0 a3=0 items=0 ppid=2555 pid=2578 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.552000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:17:23.561000 audit[2581]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2581 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.561000 audit[2581]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffe9648000 a2=0 a3=0 items=0 ppid=2555 pid=2581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.561000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Dec 16 12:17:23.564000 audit[2585]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2585 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:23.564000 audit[2585]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=136 a0=3 a1=ffffca4340c0 a2=0 a3=0 items=0 ppid=2555 pid=2585 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.564000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:17:23.564000 audit[2586]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2586 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.564000 audit[2586]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd0c7c950 a2=0 a3=0 items=0 ppid=2555 pid=2586 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.564000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:17:23.565000 audit[2587]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2587 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:23.565000 audit[2587]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff1fc3450 a2=0 a3=0 items=0 ppid=2555 pid=2587 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.565000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:17:23.566000 audit[2588]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2588 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.566000 audit[2588]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 
a1=ffffd6f4eaa0 a2=0 a3=0 items=0 ppid=2555 pid=2588 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.566000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:17:23.567000 audit[2589]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2589 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:23.567000 audit[2589]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffee32550 a2=0 a3=0 items=0 ppid=2555 pid=2589 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.567000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:17:23.567000 audit[2590]: NETFILTER_CFG table=filter:52 family=2 entries=1 op=nft_register_chain pid=2590 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:23.567000 audit[2590]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc4afdfb0 a2=0 a3=0 items=0 ppid=2555 pid=2590 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.567000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:17:23.568000 audit[2591]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2591 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:23.568000 audit[2591]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff3aebf70 a2=0 a3=0 items=0 ppid=2555 
pid=2591 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:23.568000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:17:23.574430 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 12:17:23.592651 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 12:17:23.595121 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:17:23.609737 kubelet[2555]: E1216 12:17:23.609713 2555 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:17:23.610038 kubelet[2555]: I1216 12:17:23.610019 2555 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:17:23.610147 kubelet[2555]: I1216 12:17:23.610108 2555 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:17:23.610986 kubelet[2555]: I1216 12:17:23.610939 2555 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:17:23.611524 kubelet[2555]: E1216 12:17:23.611499 2555 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:17:23.611580 kubelet[2555]: E1216 12:17:23.611541 2555 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:23.679671 systemd[1]: Created slice kubepods-burstable-podfb4ed8f3457eaa37af86999d43fb78e9.slice - libcontainer container kubepods-burstable-podfb4ed8f3457eaa37af86999d43fb78e9.slice. 
Dec 16 12:17:23.699048 kubelet[2555]: E1216 12:17:23.699002 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.701710 systemd[1]: Created slice kubepods-burstable-poda832406494b9bda518195f77cb4874f0.slice - libcontainer container kubepods-burstable-poda832406494b9bda518195f77cb4874f0.slice. Dec 16 12:17:23.712175 kubelet[2555]: I1216 12:17:23.712149 2555 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.712739 kubelet[2555]: E1216 12:17:23.712688 2555 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.29.66:6443/api/v1/nodes\": dial tcp 10.0.29.66:6443: connect: connection refused" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.712830 kubelet[2555]: E1216 12:17:23.712796 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.715097 systemd[1]: Created slice kubepods-burstable-pod21d9a165cfb1b149e915c6ce7312572d.slice - libcontainer container kubepods-burstable-pod21d9a165cfb1b149e915c6ce7312572d.slice. 
Dec 16 12:17:23.716598 kubelet[2555]: E1216 12:17:23.716568 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.743350 kubelet[2555]: I1216 12:17:23.743267 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fb4ed8f3457eaa37af86999d43fb78e9-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-5-b12717c6ea\" (UID: \"fb4ed8f3457eaa37af86999d43fb78e9\") " pod="kube-system/kube-scheduler-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.743943 kubelet[2555]: E1216 12:17:23.743904 2555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.29.66:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-b12717c6ea?timeout=10s\": dial tcp 10.0.29.66:6443: connect: connection refused" interval="400ms" Dec 16 12:17:23.844902 kubelet[2555]: I1216 12:17:23.844414 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.844902 kubelet[2555]: I1216 12:17:23.844495 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.844902 kubelet[2555]: I1216 12:17:23.844548 2555 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.844902 kubelet[2555]: I1216 12:17:23.844599 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.844902 kubelet[2555]: I1216 12:17:23.844677 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a832406494b9bda518195f77cb4874f0-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" (UID: \"a832406494b9bda518195f77cb4874f0\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.845103 kubelet[2555]: I1216 12:17:23.844724 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:23.845194 kubelet[2555]: I1216 12:17:23.845178 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a832406494b9bda518195f77cb4874f0-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" (UID: \"a832406494b9bda518195f77cb4874f0\") " 
pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea"
Dec 16 12:17:23.845261 kubelet[2555]: I1216 12:17:23.845249 2555 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a832406494b9bda518195f77cb4874f0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" (UID: \"a832406494b9bda518195f77cb4874f0\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea"
Dec 16 12:17:23.915239 kubelet[2555]: I1216 12:17:23.915215 2555 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-b12717c6ea"
Dec 16 12:17:23.915728 kubelet[2555]: E1216 12:17:23.915675 2555 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.29.66:6443/api/v1/nodes\": dial tcp 10.0.29.66:6443: connect: connection refused" node="ci-4547-0-0-5-b12717c6ea"
Dec 16 12:17:24.013133 containerd[1662]: time="2025-12-16T12:17:24.013078711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-5-b12717c6ea,Uid:fb4ed8f3457eaa37af86999d43fb78e9,Namespace:kube-system,Attempt:0,}"
Dec 16 12:17:24.026858 containerd[1662]: time="2025-12-16T12:17:24.026761194Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-5-b12717c6ea,Uid:a832406494b9bda518195f77cb4874f0,Namespace:kube-system,Attempt:0,}"
Dec 16 12:17:24.028839 containerd[1662]: time="2025-12-16T12:17:24.028768921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-5-b12717c6ea,Uid:21d9a165cfb1b149e915c6ce7312572d,Namespace:kube-system,Attempt:0,}"
Dec 16 12:17:24.144877 kubelet[2555]: E1216 12:17:24.144774 2555 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.29.66:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-5-b12717c6ea?timeout=10s\": dial tcp 10.0.29.66:6443: connect: connection refused" interval="800ms"
Dec 16 12:17:24.317608 kubelet[2555]: I1216 12:17:24.317583 2555 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-b12717c6ea"
Dec 16 12:17:24.318115 kubelet[2555]: E1216 12:17:24.318085 2555 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.29.66:6443/api/v1/nodes\": dial tcp 10.0.29.66:6443: connect: connection refused" node="ci-4547-0-0-5-b12717c6ea"
Dec 16 12:17:24.340739 kubelet[2555]: E1216 12:17:24.340699 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://10.0.29.66:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-5-b12717c6ea&limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Dec 16 12:17:24.422754 kubelet[2555]: E1216 12:17:24.422653 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.29.66:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Dec 16 12:17:24.560376 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2586975775.mount: Deactivated successfully.
Dec 16 12:17:24.565908 containerd[1662]: time="2025-12-16T12:17:24.565858441Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:17:24.567726 containerd[1662]: time="2025-12-16T12:17:24.567680127Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Dec 16 12:17:24.570620 containerd[1662]: time="2025-12-16T12:17:24.570564976Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:17:24.572727 containerd[1662]: time="2025-12-16T12:17:24.572684503Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:17:24.575140 containerd[1662]: time="2025-12-16T12:17:24.575102031Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Dec 16 12:17:24.576202 containerd[1662]: time="2025-12-16T12:17:24.576166314Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:17:24.577298 containerd[1662]: time="2025-12-16T12:17:24.577215438Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}"
Dec 16 12:17:24.578833 containerd[1662]: time="2025-12-16T12:17:24.577944440Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 543.16846ms"
Dec 16 12:17:24.579014 containerd[1662]: time="2025-12-16T12:17:24.578970683Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0"
Dec 16 12:17:24.581484 containerd[1662]: time="2025-12-16T12:17:24.581443451Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 552.431449ms"
Dec 16 12:17:24.582236 containerd[1662]: time="2025-12-16T12:17:24.582208414Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 552.33453ms"
Dec 16 12:17:24.599720 containerd[1662]: time="2025-12-16T12:17:24.599666390Z" level=info msg="connecting to shim 667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0" address="unix:///run/containerd/s/c8e23ec280ca2b97526a4348fe9aab16703804e0b2f89a273a970491b143fb6a" namespace=k8s.io protocol=ttrpc version=3
Dec 16 12:17:24.612475 containerd[1662]: time="2025-12-16T12:17:24.612393910Z" level=info msg="connecting to shim c7571e9caf86ac8e87d1a4daeca27a6452bd130354389622701fc29638c15921" address="unix:///run/containerd/s/ebb1b458ccb5e92099ccff951145529f6e571a252720826071a3841155ecebc7" namespace=k8s.io protocol=ttrpc version=3
Dec 16 12:17:24.615139 containerd[1662]: time="2025-12-16T12:17:24.615020159Z" level=info
msg="connecting to shim 59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae" address="unix:///run/containerd/s/e4e52a02362517024b177fd16ec1546c58b0316891f3e2e8d94eae78e4bd83f0" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:17:24.628025 systemd[1]: Started cri-containerd-667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0.scope - libcontainer container 667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0. Dec 16 12:17:24.634767 systemd[1]: Started cri-containerd-59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae.scope - libcontainer container 59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae. Dec 16 12:17:24.635957 systemd[1]: Started cri-containerd-c7571e9caf86ac8e87d1a4daeca27a6452bd130354389622701fc29638c15921.scope - libcontainer container c7571e9caf86ac8e87d1a4daeca27a6452bd130354389622701fc29638c15921. Dec 16 12:17:24.640000 audit: BPF prog-id=83 op=LOAD Dec 16 12:17:24.641000 audit: BPF prog-id=84 op=LOAD Dec 16 12:17:24.641000 audit[2621]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.641000 audit: BPF prog-id=84 op=UNLOAD Dec 16 12:17:24.641000 audit[2621]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:17:24.641000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.642000 audit: BPF prog-id=85 op=LOAD Dec 16 12:17:24.642000 audit[2621]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.642000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.642000 audit: BPF prog-id=86 op=LOAD Dec 16 12:17:24.642000 audit[2621]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.642000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.643000 audit: BPF prog-id=86 op=UNLOAD Dec 16 12:17:24.643000 audit[2621]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.643000 audit: BPF prog-id=85 op=UNLOAD Dec 16 12:17:24.643000 audit[2621]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.643000 audit: BPF prog-id=87 op=LOAD Dec 16 12:17:24.643000 audit[2621]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=2605 pid=2621 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3636376638373461333432396461636266623732633131343263316161 Dec 16 12:17:24.648000 audit: BPF prog-id=88 op=LOAD Dec 16 12:17:24.649000 audit: BPF prog-id=89 op=LOAD Dec 16 12:17:24.649000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2630 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.649000 audit: BPF prog-id=89 op=UNLOAD Dec 16 12:17:24.649000 audit[2663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2630 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.650000 audit: BPF prog-id=90 op=LOAD Dec 16 12:17:24.650000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2630 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.650000 audit: BPF prog-id=91 op=LOAD Dec 16 12:17:24.650000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2630 
pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.650000 audit: BPF prog-id=91 op=UNLOAD Dec 16 12:17:24.650000 audit[2663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2630 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.650000 audit: BPF prog-id=90 op=UNLOAD Dec 16 12:17:24.650000 audit[2663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2630 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.650000 audit: BPF prog-id=92 op=LOAD Dec 16 12:17:24.650000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 
a1=40001a0648 a2=98 a3=0 items=0 ppid=2630 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6337353731653963616638366163386538376431613464616563613237 Dec 16 12:17:24.650000 audit: BPF prog-id=93 op=LOAD Dec 16 12:17:24.651000 audit: BPF prog-id=94 op=LOAD Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 12:17:24.651000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 12:17:24.651000 audit: BPF prog-id=95 op=LOAD 
Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 12:17:24.651000 audit: BPF prog-id=96 op=LOAD Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 12:17:24.651000 audit: BPF prog-id=96 op=UNLOAD Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 
12:17:24.651000 audit: BPF prog-id=95 op=UNLOAD Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 12:17:24.651000 audit: BPF prog-id=97 op=LOAD Dec 16 12:17:24.651000 audit[2665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2640 pid=2665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.651000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539633135643733383763626461303266333161313830643730393234 Dec 16 12:17:24.682377 containerd[1662]: time="2025-12-16T12:17:24.681316771Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-5-b12717c6ea,Uid:21d9a165cfb1b149e915c6ce7312572d,Namespace:kube-system,Attempt:0,} returns sandbox id \"667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0\"" Dec 16 12:17:24.686427 containerd[1662]: time="2025-12-16T12:17:24.686267547Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-5-b12717c6ea,Uid:fb4ed8f3457eaa37af86999d43fb78e9,Namespace:kube-system,Attempt:0,} returns sandbox id 
\"59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae\"" Dec 16 12:17:24.688836 containerd[1662]: time="2025-12-16T12:17:24.688131953Z" level=info msg="CreateContainer within sandbox \"667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:17:24.688836 containerd[1662]: time="2025-12-16T12:17:24.688436714Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-5-b12717c6ea,Uid:a832406494b9bda518195f77cb4874f0,Namespace:kube-system,Attempt:0,} returns sandbox id \"c7571e9caf86ac8e87d1a4daeca27a6452bd130354389622701fc29638c15921\"" Dec 16 12:17:24.690250 containerd[1662]: time="2025-12-16T12:17:24.690223840Z" level=info msg="CreateContainer within sandbox \"59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:17:24.696096 containerd[1662]: time="2025-12-16T12:17:24.696056138Z" level=info msg="Container abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:17:24.702907 containerd[1662]: time="2025-12-16T12:17:24.702805040Z" level=info msg="CreateContainer within sandbox \"c7571e9caf86ac8e87d1a4daeca27a6452bd130354389622701fc29638c15921\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:17:24.703560 containerd[1662]: time="2025-12-16T12:17:24.703535562Z" level=info msg="Container 10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:17:24.708607 containerd[1662]: time="2025-12-16T12:17:24.708572538Z" level=info msg="CreateContainer within sandbox \"667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec\"" Dec 16 
12:17:24.709480 containerd[1662]: time="2025-12-16T12:17:24.709455581Z" level=info msg="StartContainer for \"abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec\"" Dec 16 12:17:24.710856 containerd[1662]: time="2025-12-16T12:17:24.710792545Z" level=info msg="connecting to shim abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec" address="unix:///run/containerd/s/c8e23ec280ca2b97526a4348fe9aab16703804e0b2f89a273a970491b143fb6a" protocol=ttrpc version=3 Dec 16 12:17:24.714657 containerd[1662]: time="2025-12-16T12:17:24.713414194Z" level=info msg="CreateContainer within sandbox \"59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c\"" Dec 16 12:17:24.714657 containerd[1662]: time="2025-12-16T12:17:24.713930756Z" level=info msg="StartContainer for \"10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c\"" Dec 16 12:17:24.715288 containerd[1662]: time="2025-12-16T12:17:24.715251880Z" level=info msg="connecting to shim 10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c" address="unix:///run/containerd/s/e4e52a02362517024b177fd16ec1546c58b0316891f3e2e8d94eae78e4bd83f0" protocol=ttrpc version=3 Dec 16 12:17:24.716208 containerd[1662]: time="2025-12-16T12:17:24.716175163Z" level=info msg="Container d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:17:24.729778 containerd[1662]: time="2025-12-16T12:17:24.729738646Z" level=info msg="CreateContainer within sandbox \"c7571e9caf86ac8e87d1a4daeca27a6452bd130354389622701fc29638c15921\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991\"" Dec 16 12:17:24.730668 containerd[1662]: time="2025-12-16T12:17:24.730634809Z" level=info msg="StartContainer for 
\"d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991\"" Dec 16 12:17:24.731691 containerd[1662]: time="2025-12-16T12:17:24.731649572Z" level=info msg="connecting to shim d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991" address="unix:///run/containerd/s/ebb1b458ccb5e92099ccff951145529f6e571a252720826071a3841155ecebc7" protocol=ttrpc version=3 Dec 16 12:17:24.732013 systemd[1]: Started cri-containerd-abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec.scope - libcontainer container abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec. Dec 16 12:17:24.745001 systemd[1]: Started cri-containerd-10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c.scope - libcontainer container 10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c. Dec 16 12:17:24.746000 audit: BPF prog-id=98 op=LOAD Dec 16 12:17:24.747000 audit: BPF prog-id=99 op=LOAD Dec 16 12:17:24.747000 audit[2734]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.747000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.747000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:17:24.747000 audit[2734]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.747000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.748000 audit: BPF prog-id=100 op=LOAD Dec 16 12:17:24.748000 audit[2734]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.748000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.749000 audit: BPF prog-id=101 op=LOAD Dec 16 12:17:24.749000 audit[2734]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.749000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.749000 audit: BPF prog-id=101 op=UNLOAD Dec 16 12:17:24.749000 audit[2734]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:17:24.749000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.749000 audit: BPF prog-id=100 op=UNLOAD Dec 16 12:17:24.749000 audit[2734]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.749000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.749000 audit: BPF prog-id=102 op=LOAD Dec 16 12:17:24.749000 audit[2734]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2605 pid=2734 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.749000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6162636365306366383866323436623563346139336634353063353033 Dec 16 12:17:24.750794 systemd[1]: Started cri-containerd-d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991.scope - libcontainer container d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991. 
Dec 16 12:17:24.756000 audit: BPF prog-id=103 op=LOAD Dec 16 12:17:24.758000 audit: BPF prog-id=104 op=LOAD Dec 16 12:17:24.758000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.758000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:17:24.758000 audit[2740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.759000 audit: BPF prog-id=105 op=LOAD Dec 16 12:17:24.759000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.759000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.759000 audit: BPF prog-id=106 op=LOAD Dec 16 12:17:24.759000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.759000 audit: BPF prog-id=106 op=UNLOAD Dec 16 12:17:24.759000 audit[2740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.759000 audit: BPF prog-id=105 op=UNLOAD Dec 16 12:17:24.759000 audit[2740]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:17:24.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.759000 audit: BPF prog-id=107 op=LOAD Dec 16 12:17:24.759000 audit[2740]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2640 pid=2740 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3130613436636566376639306337353830613737656337363236653933 Dec 16 12:17:24.767000 audit: BPF prog-id=108 op=LOAD Dec 16 12:17:24.767000 audit: BPF prog-id=109 op=LOAD Dec 16 12:17:24.767000 audit[2759]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.767000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:17:24.767000 audit[2759]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.767000 audit: BPF prog-id=110 op=LOAD Dec 16 12:17:24.767000 audit[2759]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.767000 audit: BPF prog-id=111 op=LOAD Dec 16 12:17:24.767000 audit[2759]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.767000 audit: BPF prog-id=111 op=UNLOAD Dec 16 12:17:24.767000 audit[2759]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.767000 audit: BPF prog-id=110 op=UNLOAD Dec 16 12:17:24.767000 audit[2759]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.767000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.768000 audit: BPF prog-id=112 op=LOAD Dec 16 12:17:24.768000 audit[2759]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2630 pid=2759 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:24.768000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438353339386435636339306135376430623131373236626230646236 Dec 16 12:17:24.778963 kubelet[2555]: E1216 12:17:24.778920 2555 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get 
\"https://10.0.29.66:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.29.66:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:17:24.786052 containerd[1662]: time="2025-12-16T12:17:24.786010386Z" level=info msg="StartContainer for \"abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec\" returns successfully" Dec 16 12:17:24.798886 containerd[1662]: time="2025-12-16T12:17:24.798516746Z" level=info msg="StartContainer for \"10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c\" returns successfully" Dec 16 12:17:24.807202 containerd[1662]: time="2025-12-16T12:17:24.807143894Z" level=info msg="StartContainer for \"d85398d5cc90a57d0b11726bb0db68b07ea0a1532d61e82a13ae75bac5377991\" returns successfully" Dec 16 12:17:25.122398 kubelet[2555]: I1216 12:17:25.122349 2555 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:25.576182 kubelet[2555]: E1216 12:17:25.576090 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:25.583260 kubelet[2555]: E1216 12:17:25.583162 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:25.585353 kubelet[2555]: E1216 12:17:25.585332 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:26.482680 kubelet[2555]: E1216 12:17:26.482628 2555 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 
12:17:26.562513 kubelet[2555]: I1216 12:17:26.562334 2555 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:26.562513 kubelet[2555]: E1216 12:17:26.562374 2555 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4547-0-0-5-b12717c6ea\": node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:26.573618 kubelet[2555]: E1216 12:17:26.573303 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:26.588841 kubelet[2555]: E1216 12:17:26.588159 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:26.588841 kubelet[2555]: E1216 12:17:26.588462 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:26.588841 kubelet[2555]: E1216 12:17:26.588733 2555 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:26.598794 kubelet[2555]: E1216 12:17:26.598670 2555 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4547-0-0-5-b12717c6ea.1881b14619676c85 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-5-b12717c6ea,UID:ci-4547-0-0-5-b12717c6ea,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-5-b12717c6ea,},FirstTimestamp:2025-12-16 12:17:23.537980549 +0000 UTC m=+3.071481319,LastTimestamp:2025-12-16 12:17:23.537980549 +0000 UTC 
m=+3.071481319,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-5-b12717c6ea,}" Dec 16 12:17:26.673984 kubelet[2555]: E1216 12:17:26.673950 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:26.774632 kubelet[2555]: E1216 12:17:26.774209 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:26.875000 kubelet[2555]: E1216 12:17:26.874957 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:26.975672 kubelet[2555]: E1216 12:17:26.975630 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:27.076859 kubelet[2555]: E1216 12:17:27.076709 2555 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4547-0-0-5-b12717c6ea\" not found" Dec 16 12:17:27.244306 kubelet[2555]: I1216 12:17:27.244218 2555 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:27.249778 kubelet[2555]: E1216 12:17:27.249748 2555 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:27.249778 kubelet[2555]: I1216 12:17:27.249775 2555 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:27.251385 kubelet[2555]: E1216 12:17:27.251362 2555 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" is forbidden: no PriorityClass with 
name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:27.251385 kubelet[2555]: I1216 12:17:27.251387 2555 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:27.252812 kubelet[2555]: E1216 12:17:27.252784 2555 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-5-b12717c6ea\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:27.532796 kubelet[2555]: I1216 12:17:27.532744 2555 apiserver.go:52] "Watching apiserver" Dec 16 12:17:27.543368 kubelet[2555]: I1216 12:17:27.543331 2555 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:17:28.657098 systemd[1]: Reload requested from client PID 2840 ('systemctl') (unit session-10.scope)... Dec 16 12:17:28.657118 systemd[1]: Reloading... Dec 16 12:17:28.722837 zram_generator::config[2886]: No configuration found. Dec 16 12:17:28.923307 systemd[1]: Reloading finished in 265 ms. Dec 16 12:17:28.950204 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:17:28.962905 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 12:17:28.963186 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:28.962000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:28.963924 kernel: kauditd_printk_skb: 158 callbacks suppressed Dec 16 12:17:28.963958 kernel: audit: type=1131 audit(1765887448.962:397): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:28.963264 systemd[1]: kubelet.service: Consumed 3.310s CPU time, 121.7M memory peak. Dec 16 12:17:28.965049 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:17:28.965000 audit: BPF prog-id=113 op=LOAD Dec 16 12:17:28.967146 kernel: audit: type=1334 audit(1765887448.965:398): prog-id=113 op=LOAD Dec 16 12:17:28.967187 kernel: audit: type=1334 audit(1765887448.965:399): prog-id=75 op=UNLOAD Dec 16 12:17:28.965000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:17:28.967858 kernel: audit: type=1334 audit(1765887448.966:400): prog-id=114 op=LOAD Dec 16 12:17:28.966000 audit: BPF prog-id=114 op=LOAD Dec 16 12:17:28.967000 audit: BPF prog-id=115 op=LOAD Dec 16 12:17:28.967000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:17:28.967000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:17:28.967000 audit: BPF prog-id=116 op=LOAD Dec 16 12:17:28.969351 kernel: audit: type=1334 audit(1765887448.967:401): prog-id=115 op=LOAD Dec 16 12:17:28.969381 kernel: audit: type=1334 audit(1765887448.967:402): prog-id=76 op=UNLOAD Dec 16 12:17:28.969395 kernel: audit: type=1334 audit(1765887448.967:403): prog-id=77 op=UNLOAD Dec 16 12:17:28.969411 kernel: audit: type=1334 audit(1765887448.967:404): prog-id=116 op=LOAD Dec 16 12:17:28.967000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:17:28.972256 kernel: audit: type=1334 audit(1765887448.967:405): prog-id=63 op=UNLOAD Dec 16 12:17:28.972285 kernel: audit: type=1334 audit(1765887448.969:406): prog-id=117 op=LOAD Dec 16 12:17:28.969000 audit: BPF prog-id=117 op=LOAD Dec 16 12:17:28.969000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:17:28.969000 audit: BPF prog-id=118 op=LOAD Dec 16 12:17:28.988000 audit: BPF prog-id=119 op=LOAD Dec 16 12:17:28.988000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:17:28.988000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:17:28.989000 audit: BPF prog-id=120 op=LOAD Dec 16 12:17:28.989000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:17:28.989000 audit: BPF prog-id=121 op=LOAD Dec 
16 12:17:28.989000 audit: BPF prog-id=122 op=LOAD Dec 16 12:17:28.989000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:17:28.989000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:17:28.991000 audit: BPF prog-id=123 op=LOAD Dec 16 12:17:28.991000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:17:28.992000 audit: BPF prog-id=124 op=LOAD Dec 16 12:17:28.992000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:17:28.992000 audit: BPF prog-id=125 op=LOAD Dec 16 12:17:28.992000 audit: BPF prog-id=126 op=LOAD Dec 16 12:17:28.992000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:17:28.992000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:17:28.992000 audit: BPF prog-id=127 op=LOAD Dec 16 12:17:28.992000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:17:28.992000 audit: BPF prog-id=128 op=LOAD Dec 16 12:17:28.992000 audit: BPF prog-id=129 op=LOAD Dec 16 12:17:28.992000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:17:28.992000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:17:28.993000 audit: BPF prog-id=130 op=LOAD Dec 16 12:17:28.993000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:17:28.993000 audit: BPF prog-id=131 op=LOAD Dec 16 12:17:28.993000 audit: BPF prog-id=132 op=LOAD Dec 16 12:17:28.993000 audit: BPF prog-id=81 op=UNLOAD Dec 16 12:17:28.993000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:17:31.941336 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:17:31.940000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:31.946034 (kubelet)[2932]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:17:32.041518 kubelet[2932]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Dec 16 12:17:32.041518 kubelet[2932]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:17:32.041853 kubelet[2932]: I1216 12:17:32.041545 2932 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:17:32.047141 kubelet[2932]: I1216 12:17:32.047077 2932 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:17:32.047141 kubelet[2932]: I1216 12:17:32.047103 2932 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:17:32.047141 kubelet[2932]: I1216 12:17:32.047133 2932 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:17:32.047141 kubelet[2932]: I1216 12:17:32.047139 2932 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:17:32.047420 kubelet[2932]: I1216 12:17:32.047312 2932 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:17:32.048461 kubelet[2932]: I1216 12:17:32.048436 2932 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 12:17:32.050423 kubelet[2932]: I1216 12:17:32.050368 2932 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:17:32.056555 kubelet[2932]: I1216 12:17:32.056533 2932 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:17:32.059369 kubelet[2932]: I1216 12:17:32.059346 2932 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:17:32.059544 kubelet[2932]: I1216 12:17:32.059519 2932 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:17:32.059689 kubelet[2932]: I1216 12:17:32.059545 2932 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-5-b12717c6ea","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:17:32.059761 kubelet[2932]: I1216 12:17:32.059691 2932 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 
12:17:32.059761 kubelet[2932]: I1216 12:17:32.059700 2932 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:17:32.059864 kubelet[2932]: I1216 12:17:32.059844 2932 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:17:32.060905 kubelet[2932]: I1216 12:17:32.060773 2932 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:17:32.061091 kubelet[2932]: I1216 12:17:32.061076 2932 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:17:32.061130 kubelet[2932]: I1216 12:17:32.061095 2932 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:17:32.061130 kubelet[2932]: I1216 12:17:32.061116 2932 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:17:32.061130 kubelet[2932]: I1216 12:17:32.061126 2932 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:17:32.062049 kubelet[2932]: I1216 12:17:32.062020 2932 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:17:32.063408 kubelet[2932]: I1216 12:17:32.063382 2932 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:17:32.063467 kubelet[2932]: I1216 12:17:32.063420 2932 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 12:17:32.068386 kubelet[2932]: I1216 12:17:32.067196 2932 server.go:1262] "Started kubelet" Dec 16 12:17:32.068386 kubelet[2932]: I1216 12:17:32.068249 2932 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:17:32.069767 kubelet[2932]: I1216 12:17:32.069717 2932 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:17:32.072556 kubelet[2932]: I1216 12:17:32.072522 2932 server.go:310] "Adding debug handlers to 
kubelet server" Dec 16 12:17:32.076887 kubelet[2932]: I1216 12:17:32.076805 2932 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:17:32.076887 kubelet[2932]: I1216 12:17:32.076896 2932 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:17:32.077252 kubelet[2932]: I1216 12:17:32.077097 2932 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:17:32.081012 kubelet[2932]: I1216 12:17:32.080956 2932 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:17:32.091941 kubelet[2932]: I1216 12:17:32.091851 2932 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:17:32.092142 kubelet[2932]: I1216 12:17:32.092100 2932 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:17:32.092340 kubelet[2932]: I1216 12:17:32.092237 2932 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:17:32.093582 kubelet[2932]: I1216 12:17:32.093555 2932 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:17:32.093680 kubelet[2932]: I1216 12:17:32.093657 2932 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:17:32.094328 kubelet[2932]: E1216 12:17:32.094289 2932 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:17:32.094697 kubelet[2932]: I1216 12:17:32.094629 2932 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:17:32.106274 kubelet[2932]: I1216 12:17:32.106228 2932 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Dec 16 12:17:32.108026 kubelet[2932]: I1216 12:17:32.108005 2932 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Dec 16 12:17:32.108237 kubelet[2932]: I1216 12:17:32.108221 2932 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:17:32.108473 kubelet[2932]: I1216 12:17:32.108458 2932 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:17:32.108643 kubelet[2932]: E1216 12:17:32.108604 2932 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:17:32.130024 kubelet[2932]: I1216 12:17:32.129993 2932 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:17:32.130024 kubelet[2932]: I1216 12:17:32.130014 2932 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:17:32.130024 kubelet[2932]: I1216 12:17:32.130036 2932 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:17:32.130174 kubelet[2932]: I1216 12:17:32.130161 2932 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:17:32.130196 kubelet[2932]: I1216 12:17:32.130175 2932 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:17:32.130196 kubelet[2932]: I1216 12:17:32.130193 2932 policy_none.go:49] "None policy: Start" Dec 16 12:17:32.130230 kubelet[2932]: I1216 12:17:32.130200 2932 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:17:32.130230 kubelet[2932]: I1216 12:17:32.130208 2932 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:17:32.130313 kubelet[2932]: I1216 12:17:32.130292 2932 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 16 12:17:32.130313 kubelet[2932]: I1216 12:17:32.130307 2932 policy_none.go:47] "Start" Dec 16 12:17:32.135294 kubelet[2932]: E1216 12:17:32.135272 2932 manager.go:513] "Failed to read data from 
checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:17:32.135456 kubelet[2932]: I1216 12:17:32.135438 2932 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:17:32.135607 kubelet[2932]: I1216 12:17:32.135467 2932 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:17:32.135726 kubelet[2932]: I1216 12:17:32.135704 2932 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:17:32.137836 kubelet[2932]: E1216 12:17:32.137688 2932 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:17:32.210758 kubelet[2932]: I1216 12:17:32.210327 2932 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.210758 kubelet[2932]: I1216 12:17:32.210400 2932 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.210758 kubelet[2932]: I1216 12:17:32.210429 2932 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.242159 kubelet[2932]: I1216 12:17:32.242135 2932 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.250359 kubelet[2932]: I1216 12:17:32.250315 2932 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.250445 kubelet[2932]: I1216 12:17:32.250396 2932 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393677 kubelet[2932]: I1216 12:17:32.393626 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: 
\"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393677 kubelet[2932]: I1216 12:17:32.393671 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393961 kubelet[2932]: I1216 12:17:32.393693 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393961 kubelet[2932]: I1216 12:17:32.393714 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/fb4ed8f3457eaa37af86999d43fb78e9-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-5-b12717c6ea\" (UID: \"fb4ed8f3457eaa37af86999d43fb78e9\") " pod="kube-system/kube-scheduler-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393961 kubelet[2932]: I1216 12:17:32.393759 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/a832406494b9bda518195f77cb4874f0-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" (UID: \"a832406494b9bda518195f77cb4874f0\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393961 
kubelet[2932]: I1216 12:17:32.393801 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.393961 kubelet[2932]: I1216 12:17:32.393889 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/21d9a165cfb1b149e915c6ce7312572d-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-5-b12717c6ea\" (UID: \"21d9a165cfb1b149e915c6ce7312572d\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.394113 kubelet[2932]: I1216 12:17:32.393938 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/a832406494b9bda518195f77cb4874f0-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" (UID: \"a832406494b9bda518195f77cb4874f0\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:32.394113 kubelet[2932]: I1216 12:17:32.393981 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/a832406494b9bda518195f77cb4874f0-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-5-b12717c6ea\" (UID: \"a832406494b9bda518195f77cb4874f0\") " pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" Dec 16 12:17:33.062271 kubelet[2932]: I1216 12:17:33.062176 2932 apiserver.go:52] "Watching apiserver" Dec 16 12:17:33.092983 kubelet[2932]: I1216 12:17:33.092946 2932 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:17:33.150597 kubelet[2932]: I1216 
12:17:33.150389 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-5-b12717c6ea" podStartSLOduration=1.1503732979999999 podStartE2EDuration="1.150373298s" podCreationTimestamp="2025-12-16 12:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:17:33.140300266 +0000 UTC m=+1.191215377" watchObservedRunningTime="2025-12-16 12:17:33.150373298 +0000 UTC m=+1.201288369" Dec 16 12:17:33.150597 kubelet[2932]: I1216 12:17:33.150508 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-5-b12717c6ea" podStartSLOduration=1.1505037790000001 podStartE2EDuration="1.150503779s" podCreationTimestamp="2025-12-16 12:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:17:33.150473299 +0000 UTC m=+1.201388450" watchObservedRunningTime="2025-12-16 12:17:33.150503779 +0000 UTC m=+1.201418890" Dec 16 12:17:33.171588 kubelet[2932]: I1216 12:17:33.171503 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-5-b12717c6ea" podStartSLOduration=1.171474606 podStartE2EDuration="1.171474606s" podCreationTimestamp="2025-12-16 12:17:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:17:33.161664255 +0000 UTC m=+1.212579366" watchObservedRunningTime="2025-12-16 12:17:33.171474606 +0000 UTC m=+1.222389717" Dec 16 12:17:35.862911 kubelet[2932]: I1216 12:17:35.862877 2932 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:17:35.863513 containerd[1662]: time="2025-12-16T12:17:35.863187868Z" level=info msg="No cni config template is specified, 
wait for other system components to drop the config." Dec 16 12:17:35.864053 kubelet[2932]: I1216 12:17:35.863738 2932 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:17:37.008007 systemd[1]: Created slice kubepods-besteffort-pod79c0228e_bf0d_434f_9534_f74ac2c4029a.slice - libcontainer container kubepods-besteffort-pod79c0228e_bf0d_434f_9534_f74ac2c4029a.slice. Dec 16 12:17:37.024839 kubelet[2932]: I1216 12:17:37.024688 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/79c0228e-bf0d-434f-9534-f74ac2c4029a-lib-modules\") pod \"kube-proxy-mrmgg\" (UID: \"79c0228e-bf0d-434f-9534-f74ac2c4029a\") " pod="kube-system/kube-proxy-mrmgg" Dec 16 12:17:37.024839 kubelet[2932]: I1216 12:17:37.024731 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7g7c\" (UniqueName: \"kubernetes.io/projected/79c0228e-bf0d-434f-9534-f74ac2c4029a-kube-api-access-b7g7c\") pod \"kube-proxy-mrmgg\" (UID: \"79c0228e-bf0d-434f-9534-f74ac2c4029a\") " pod="kube-system/kube-proxy-mrmgg" Dec 16 12:17:37.024839 kubelet[2932]: I1216 12:17:37.024755 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/79c0228e-bf0d-434f-9534-f74ac2c4029a-kube-proxy\") pod \"kube-proxy-mrmgg\" (UID: \"79c0228e-bf0d-434f-9534-f74ac2c4029a\") " pod="kube-system/kube-proxy-mrmgg" Dec 16 12:17:37.024839 kubelet[2932]: I1216 12:17:37.024769 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/79c0228e-bf0d-434f-9534-f74ac2c4029a-xtables-lock\") pod \"kube-proxy-mrmgg\" (UID: \"79c0228e-bf0d-434f-9534-f74ac2c4029a\") " pod="kube-system/kube-proxy-mrmgg" Dec 16 12:17:37.073164 systemd[1]: Created slice 
kubepods-besteffort-pod381898a3_7c53_40de_8c60_fd63fe146417.slice - libcontainer container kubepods-besteffort-pod381898a3_7c53_40de_8c60_fd63fe146417.slice. Dec 16 12:17:37.125858 kubelet[2932]: I1216 12:17:37.125315 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26wrf\" (UniqueName: \"kubernetes.io/projected/381898a3-7c53-40de-8c60-fd63fe146417-kube-api-access-26wrf\") pod \"tigera-operator-65cdcdfd6d-4tjvn\" (UID: \"381898a3-7c53-40de-8c60-fd63fe146417\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-4tjvn" Dec 16 12:17:37.125858 kubelet[2932]: I1216 12:17:37.125367 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/381898a3-7c53-40de-8c60-fd63fe146417-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-4tjvn\" (UID: \"381898a3-7c53-40de-8c60-fd63fe146417\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-4tjvn" Dec 16 12:17:37.321327 containerd[1662]: time="2025-12-16T12:17:37.321231498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mrmgg,Uid:79c0228e-bf0d-434f-9534-f74ac2c4029a,Namespace:kube-system,Attempt:0,}" Dec 16 12:17:37.340580 containerd[1662]: time="2025-12-16T12:17:37.340477720Z" level=info msg="connecting to shim 5ef050c6bf9e2b2afd33132ea4758dd948704f58a657c8fe89c2a8d431e8ac93" address="unix:///run/containerd/s/f08b76858e26412b5b9a5514b02972e9d83f574181638f953561ad67ac54e4fa" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:17:37.362117 systemd[1]: Started cri-containerd-5ef050c6bf9e2b2afd33132ea4758dd948704f58a657c8fe89c2a8d431e8ac93.scope - libcontainer container 5ef050c6bf9e2b2afd33132ea4758dd948704f58a657c8fe89c2a8d431e8ac93. 
Dec 16 12:17:37.373345 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:17:37.373450 kernel: audit: type=1334 audit(1765887457.369:439): prog-id=133 op=LOAD Dec 16 12:17:37.369000 audit: BPF prog-id=133 op=LOAD Dec 16 12:17:37.370000 audit: BPF prog-id=134 op=LOAD Dec 16 12:17:37.374487 kernel: audit: type=1334 audit(1765887457.370:440): prog-id=134 op=LOAD Dec 16 12:17:37.370000 audit[3005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.378326 kernel: audit: type=1300 audit(1765887457.370:440): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.370000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.381446 containerd[1662]: time="2025-12-16T12:17:37.381409531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-4tjvn,Uid:381898a3-7c53-40de-8c60-fd63fe146417,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:17:37.382467 kernel: audit: type=1327 audit(1765887457.370:440): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.370000 audit: BPF prog-id=134 
op=UNLOAD Dec 16 12:17:37.384431 kernel: audit: type=1334 audit(1765887457.370:441): prog-id=134 op=UNLOAD Dec 16 12:17:37.370000 audit[3005]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.388606 kernel: audit: type=1300 audit(1765887457.370:441): arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.370000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.392280 kernel: audit: type=1327 audit(1765887457.370:441): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.370000 audit: BPF prog-id=135 op=LOAD Dec 16 12:17:37.393521 kernel: audit: type=1334 audit(1765887457.370:442): prog-id=135 op=LOAD Dec 16 12:17:37.370000 audit[3005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.398343 kernel: audit: type=1300 audit(1765887457.370:442): arch=c00000b7 syscall=280 success=yes 
exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.370000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.402390 kernel: audit: type=1327 audit(1765887457.370:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.371000 audit: BPF prog-id=136 op=LOAD Dec 16 12:17:37.371000 audit[3005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.371000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.371000 audit: BPF prog-id=136 op=UNLOAD Dec 16 12:17:37.371000 audit[3005]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.371000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.371000 audit: BPF prog-id=135 op=UNLOAD Dec 16 12:17:37.371000 audit[3005]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.371000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.371000 audit: BPF prog-id=137 op=LOAD Dec 16 12:17:37.371000 audit[3005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2993 pid=3005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.371000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3565663035306336626639653262326166643333313332656134373538 Dec 16 12:17:37.408889 containerd[1662]: time="2025-12-16T12:17:37.408841019Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-mrmgg,Uid:79c0228e-bf0d-434f-9534-f74ac2c4029a,Namespace:kube-system,Attempt:0,} returns sandbox id \"5ef050c6bf9e2b2afd33132ea4758dd948704f58a657c8fe89c2a8d431e8ac93\"" Dec 16 12:17:37.412000 containerd[1662]: 
time="2025-12-16T12:17:37.411953509Z" level=info msg="connecting to shim 5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5" address="unix:///run/containerd/s/69ff2fc689f8a8df5d9a3a4e3c3a262c789426956546e950d417f472c72c1163" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:17:37.414801 containerd[1662]: time="2025-12-16T12:17:37.413835755Z" level=info msg="CreateContainer within sandbox \"5ef050c6bf9e2b2afd33132ea4758dd948704f58a657c8fe89c2a8d431e8ac93\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:17:37.424244 containerd[1662]: time="2025-12-16T12:17:37.424194508Z" level=info msg="Container 87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:17:37.433501 containerd[1662]: time="2025-12-16T12:17:37.433455058Z" level=info msg="CreateContainer within sandbox \"5ef050c6bf9e2b2afd33132ea4758dd948704f58a657c8fe89c2a8d431e8ac93\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f\"" Dec 16 12:17:37.434009 containerd[1662]: time="2025-12-16T12:17:37.433967939Z" level=info msg="StartContainer for \"87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f\"" Dec 16 12:17:37.435667 containerd[1662]: time="2025-12-16T12:17:37.435639345Z" level=info msg="connecting to shim 87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f" address="unix:///run/containerd/s/f08b76858e26412b5b9a5514b02972e9d83f574181638f953561ad67ac54e4fa" protocol=ttrpc version=3 Dec 16 12:17:37.438994 systemd[1]: Started cri-containerd-5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5.scope - libcontainer container 5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5. 
Dec 16 12:17:37.459215 systemd[1]: Started cri-containerd-87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f.scope - libcontainer container 87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f. Dec 16 12:17:37.460000 audit: BPF prog-id=138 op=LOAD Dec 16 12:17:37.460000 audit: BPF prog-id=139 op=LOAD Dec 16 12:17:37.460000 audit[3051]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.460000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:17:37.460000 audit[3051]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.460000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.461000 audit: BPF prog-id=140 op=LOAD Dec 16 12:17:37.461000 audit[3051]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:17:37.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.461000 audit: BPF prog-id=141 op=LOAD Dec 16 12:17:37.461000 audit[3051]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.461000 audit: BPF prog-id=141 op=UNLOAD Dec 16 12:17:37.461000 audit[3051]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.461000 audit: BPF prog-id=140 op=UNLOAD Dec 16 12:17:37.461000 audit[3051]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.461000 audit: BPF prog-id=142 op=LOAD Dec 16 12:17:37.461000 audit[3051]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3040 pid=3051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.461000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343030313266623864653335383637343164613133323662356638 Dec 16 12:17:37.488127 containerd[1662]: time="2025-12-16T12:17:37.488052592Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-4tjvn,Uid:381898a3-7c53-40de-8c60-fd63fe146417,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5\"" Dec 16 12:17:37.490077 containerd[1662]: time="2025-12-16T12:17:37.490050639Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:17:37.512000 audit: BPF prog-id=143 op=LOAD Dec 16 12:17:37.512000 audit[3063]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2993 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.512000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643661333430353837643135643331393039383438383164306233 Dec 16 12:17:37.512000 audit: BPF prog-id=144 op=LOAD Dec 16 12:17:37.512000 audit[3063]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2993 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.512000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643661333430353837643135643331393039383438383164306233 Dec 16 12:17:37.512000 audit: BPF prog-id=144 op=UNLOAD Dec 16 12:17:37.512000 audit[3063]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.512000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643661333430353837643135643331393039383438383164306233 Dec 16 12:17:37.512000 audit: BPF prog-id=143 op=UNLOAD Dec 16 12:17:37.512000 audit[3063]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2993 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:17:37.512000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643661333430353837643135643331393039383438383164306233 Dec 16 12:17:37.512000 audit: BPF prog-id=145 op=LOAD Dec 16 12:17:37.512000 audit[3063]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2993 pid=3063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.512000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3837643661333430353837643135643331393039383438383164306233 Dec 16 12:17:37.530986 containerd[1662]: time="2025-12-16T12:17:37.530880530Z" level=info msg="StartContainer for \"87d6a340587d15d3190984881d0b3cfafc98a3c771df9eb7e8ac3b0ba80c6e8f\" returns successfully" Dec 16 12:17:37.759000 audit[3142]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.759000 audit[3142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffde8b5060 a2=0 a3=1 items=0 ppid=3084 pid=3142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.759000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:17:37.761000 audit[3143]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3143 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Dec 16 12:17:37.761000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff2f3b3e0 a2=0 a3=1 items=0 ppid=3084 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.761000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:17:37.762000 audit[3146]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.762000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd203cec0 a2=0 a3=1 items=0 ppid=3084 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.762000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:17:37.764000 audit[3149]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.764000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff9855a60 a2=0 a3=1 items=0 ppid=3084 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.764000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:17:37.764000 audit[3148]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3148 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.764000 audit[3148]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffeb35e870 a2=0 a3=1 items=0 ppid=3084 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.764000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:17:37.766000 audit[3150]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.766000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff2dfff10 a2=0 a3=1 items=0 ppid=3084 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.766000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:17:37.861000 audit[3151]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3151 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.861000 audit[3151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffc606f630 a2=0 a3=1 items=0 ppid=3084 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.861000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:17:37.863000 audit[3153]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.863000 audit[3153]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=752 a0=3 a1=ffffc21afa50 a2=0 a3=1 items=0 ppid=3084 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.863000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Dec 16 12:17:37.866000 audit[3156]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3156 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.866000 audit[3156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffe1d60590 a2=0 a3=1 items=0 ppid=3084 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.866000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:17:37.868000 audit[3157]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.868000 audit[3157]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff9d91bd0 a2=0 a3=1 items=0 ppid=3084 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.868000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:17:37.870000 audit[3159]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.870000 audit[3159]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff3ad2870 a2=0 a3=1 items=0 ppid=3084 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.870000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:17:37.871000 audit[3160]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3160 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.871000 audit[3160]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd5dd4190 a2=0 a3=1 items=0 ppid=3084 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.871000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:17:37.873000 audit[3162]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3162 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.873000 audit[3162]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff2438500 a2=0 a3=1 items=0 ppid=3084 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.873000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.876000 audit[3165]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3165 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.876000 audit[3165]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffff04832b0 a2=0 a3=1 items=0 ppid=3084 pid=3165 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.876000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.877000 audit[3166]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3166 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.877000 audit[3166]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffe409db0 a2=0 a3=1 items=0 ppid=3084 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.877000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:17:37.879000 audit[3168]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3168 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.879000 audit[3168]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcdba0990 a2=0 a3=1 items=0 ppid=3084 pid=3168 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.879000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:17:37.880000 audit[3169]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3169 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.880000 audit[3169]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffeef37fd0 a2=0 a3=1 items=0 ppid=3084 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.880000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:17:37.883000 audit[3171]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3171 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.883000 audit[3171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffd73cea0 a2=0 a3=1 items=0 ppid=3084 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.883000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Dec 16 12:17:37.886000 audit[3174]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3174 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.886000 audit[3174]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe0a4edd0 a2=0 a3=1 items=0 ppid=3084 pid=3174 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.886000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:17:37.890000 audit[3177]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3177 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.890000 audit[3177]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffb982ed0 a2=0 a3=1 items=0 ppid=3084 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.890000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:17:37.891000 audit[3178]: NETFILTER_CFG table=nat:74 family=2 entries=1 
op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.891000 audit[3178]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc02e93c0 a2=0 a3=1 items=0 ppid=3084 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.891000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:17:37.893000 audit[3180]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3180 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.893000 audit[3180]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe2b3ad40 a2=0 a3=1 items=0 ppid=3084 pid=3180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.893000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.897000 audit[3183]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3183 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.897000 audit[3183]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd6025650 a2=0 a3=1 items=0 ppid=3084 pid=3183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.897000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.898000 audit[3184]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3184 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.898000 audit[3184]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdd760140 a2=0 a3=1 items=0 ppid=3084 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.898000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:17:37.901000 audit[3186]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:17:37.901000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffff657dc0 a2=0 a3=1 items=0 ppid=3084 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.901000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:17:37.925000 audit[3192]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3192 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:17:37.925000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc023d390 a2=0 a3=1 items=0 ppid=3084 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.925000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:17:37.941000 audit[3192]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:17:37.941000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffc023d390 a2=0 a3=1 items=0 ppid=3084 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.941000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:17:37.943000 audit[3197]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3197 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.943000 audit[3197]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffffd7b29b0 a2=0 a3=1 items=0 ppid=3084 pid=3197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.943000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:17:37.945000 audit[3199]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.945000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=fffff5940db0 a2=0 a3=1 items=0 ppid=3084 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.945000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:17:37.949000 audit[3202]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3202 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.949000 audit[3202]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc8d41790 a2=0 a3=1 items=0 ppid=3084 pid=3202 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.949000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Dec 16 12:17:37.950000 audit[3203]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3203 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.950000 audit[3203]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffde61ef70 a2=0 a3=1 items=0 ppid=3084 pid=3203 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.950000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:17:37.952000 audit[3205]: NETFILTER_CFG table=filter:85 family=10 entries=1 
op=nft_register_rule pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.952000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff17599b0 a2=0 a3=1 items=0 ppid=3084 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.952000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:17:37.953000 audit[3206]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3206 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.953000 audit[3206]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd087c380 a2=0 a3=1 items=0 ppid=3084 pid=3206 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.953000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:17:37.956000 audit[3208]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.956000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd253e5b0 a2=0 a3=1 items=0 ppid=3084 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.956000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.959000 audit[3211]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3211 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.959000 audit[3211]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=fffff04e0330 a2=0 a3=1 items=0 ppid=3084 pid=3211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.959000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.960000 audit[3212]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3212 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.960000 audit[3212]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdae58890 a2=0 a3=1 items=0 ppid=3084 pid=3212 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.960000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:17:37.962000 audit[3214]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3214 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.962000 audit[3214]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 
a1=ffffd5246220 a2=0 a3=1 items=0 ppid=3084 pid=3214 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.962000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:17:37.963000 audit[3215]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3215 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.963000 audit[3215]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc7909710 a2=0 a3=1 items=0 ppid=3084 pid=3215 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.963000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:17:37.965000 audit[3217]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3217 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.965000 audit[3217]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcc2fe430 a2=0 a3=1 items=0 ppid=3084 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.965000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:17:37.969000 
audit[3220]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.969000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffebad700 a2=0 a3=1 items=0 ppid=3084 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.969000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:17:37.972000 audit[3223]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.972000 audit[3223]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe5b16e20 a2=0 a3=1 items=0 ppid=3084 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.972000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Dec 16 12:17:37.974000 audit[3224]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3224 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.974000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdd127490 a2=0 a3=1 items=0 ppid=3084 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.974000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:17:37.976000 audit[3226]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.976000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe5bbf9c0 a2=0 a3=1 items=0 ppid=3084 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.976000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.979000 audit[3229]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3229 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.979000 audit[3229]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc91b0fa0 a2=0 a3=1 items=0 ppid=3084 pid=3229 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.979000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:17:37.980000 audit[3230]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3230 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.980000 audit[3230]: SYSCALL arch=c00000b7 
syscall=211 success=yes exit=100 a0=3 a1=fffff56330a0 a2=0 a3=1 items=0 ppid=3084 pid=3230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.980000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:17:37.982000 audit[3232]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3232 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.982000 audit[3232]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffc222dfa0 a2=0 a3=1 items=0 ppid=3084 pid=3232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.982000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:17:37.984000 audit[3233]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3233 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.984000 audit[3233]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdb8bd870 a2=0 a3=1 items=0 ppid=3084 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.984000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:17:37.986000 audit[3235]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3235 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables" Dec 16 12:17:37.986000 audit[3235]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffffdc934c0 a2=0 a3=1 items=0 ppid=3084 pid=3235 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.986000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:17:37.989000 audit[3238]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3238 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:17:37.989000 audit[3238]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffea765680 a2=0 a3=1 items=0 ppid=3084 pid=3238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.989000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:17:37.993000 audit[3240]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3240 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:17:37.993000 audit[3240]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=fffff082f0e0 a2=0 a3=1 items=0 ppid=3084 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.993000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:17:37.993000 audit[3240]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3240 subj=system_u:system_r:kernel_t:s0 
comm="ip6tables-resto" Dec 16 12:17:37.993000 audit[3240]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=fffff082f0e0 a2=0 a3=1 items=0 ppid=3084 pid=3240 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.993000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:17:38.143826 kubelet[2932]: I1216 12:17:38.143325 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-mrmgg" podStartSLOduration=2.143311171 podStartE2EDuration="2.143311171s" podCreationTimestamp="2025-12-16 12:17:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:17:38.142401688 +0000 UTC m=+6.193316799" watchObservedRunningTime="2025-12-16 12:17:38.143311171 +0000 UTC m=+6.194226282" Dec 16 12:17:40.289415 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount355959376.mount: Deactivated successfully. 
Dec 16 12:17:41.252095 containerd[1662]: time="2025-12-16T12:17:41.252028129Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:17:41.253365 containerd[1662]: time="2025-12-16T12:17:41.253309053Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434"
Dec 16 12:17:41.254272 containerd[1662]: time="2025-12-16T12:17:41.254240896Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:17:41.256718 containerd[1662]: time="2025-12-16T12:17:41.256675664Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:17:41.257578 containerd[1662]: time="2025-12-16T12:17:41.257541426Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 3.767332547s"
Dec 16 12:17:41.257619 containerd[1662]: time="2025-12-16T12:17:41.257577707Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\""
Dec 16 12:17:41.261491 containerd[1662]: time="2025-12-16T12:17:41.261459119Z" level=info msg="CreateContainer within sandbox \"5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}"
Dec 16 12:17:41.268852 containerd[1662]: time="2025-12-16T12:17:41.268377621Z" level=info msg="Container 1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:17:41.271267 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3923168848.mount: Deactivated successfully.
Dec 16 12:17:41.275507 containerd[1662]: time="2025-12-16T12:17:41.275439124Z" level=info msg="CreateContainer within sandbox \"5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\""
Dec 16 12:17:41.276016 containerd[1662]: time="2025-12-16T12:17:41.275919645Z" level=info msg="StartContainer for \"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\""
Dec 16 12:17:41.277089 containerd[1662]: time="2025-12-16T12:17:41.277007849Z" level=info msg="connecting to shim 1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4" address="unix:///run/containerd/s/69ff2fc689f8a8df5d9a3a4e3c3a262c789426956546e950d417f472c72c1163" protocol=ttrpc version=3
Dec 16 12:17:41.297167 systemd[1]: Started cri-containerd-1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4.scope - libcontainer container 1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4.
Dec 16 12:17:41.304000 audit: BPF prog-id=146 op=LOAD
Dec 16 12:17:41.305000 audit: BPF prog-id=147 op=LOAD
Dec 16 12:17:41.305000 audit[3250]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.305000 audit: BPF prog-id=147 op=UNLOAD
Dec 16 12:17:41.305000 audit[3250]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.305000 audit: BPF prog-id=148 op=LOAD
Dec 16 12:17:41.305000 audit[3250]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.305000 audit: BPF prog-id=149 op=LOAD
Dec 16 12:17:41.305000 audit[3250]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.305000 audit: BPF prog-id=149 op=UNLOAD
Dec 16 12:17:41.305000 audit[3250]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.305000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.306000 audit: BPF prog-id=148 op=UNLOAD
Dec 16 12:17:41.306000 audit[3250]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.306000 audit: BPF prog-id=150 op=LOAD
Dec 16 12:17:41.306000 audit[3250]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3040 pid=3250 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:41.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613337373936363032336566613039666466383433373734363639
Dec 16 12:17:41.323060 containerd[1662]: time="2025-12-16T12:17:41.323021996Z" level=info msg="StartContainer for \"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\" returns successfully"
Dec 16 12:17:42.177884 kubelet[2932]: I1216 12:17:42.177573 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-4tjvn" podStartSLOduration=1.40859358 podStartE2EDuration="5.177557333s" podCreationTimestamp="2025-12-16 12:17:37 +0000 UTC" firstStartedPulling="2025-12-16 12:17:37.489316756 +0000 UTC m=+5.540231867" lastFinishedPulling="2025-12-16 12:17:41.258280509 +0000 UTC m=+9.309195620" observedRunningTime="2025-12-16 12:17:42.177333893 +0000 UTC m=+10.228248964" watchObservedRunningTime="2025-12-16 12:17:42.177557333 +0000 UTC m=+10.228472444"
Dec 16 12:17:46.603244 sudo[1959]: pam_unix(sudo:session): session closed for user root
Dec 16 12:17:46.601000 audit[1959]: USER_END pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:17:46.603992 kernel: kauditd_printk_skb: 224 callbacks suppressed
Dec 16 12:17:46.604035 kernel: audit: type=1106 audit(1765887466.601:519): pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:17:46.601000 audit[1959]: CRED_DISP pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:17:46.609468 kernel: audit: type=1104 audit(1765887466.601:520): pid=1959 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:17:46.767553 sshd[1958]: Connection closed by 139.178.68.195 port 50918
Dec 16 12:17:46.767127 sshd-session[1954]: pam_unix(sshd:session): session closed for user core
Dec 16 12:17:46.767000 audit[1954]: USER_END pid=1954 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:17:46.772000 audit[1954]: CRED_DISP pid=1954 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:17:46.777083 systemd[1]: sshd@8-10.0.29.66:22-139.178.68.195:50918.service: Deactivated successfully.
Dec 16 12:17:46.779278 kernel: audit: type=1106 audit(1765887466.767:521): pid=1954 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:17:46.779328 kernel: audit: type=1104 audit(1765887466.772:522): pid=1954 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:17:46.779347 kernel: audit: type=1131 audit(1765887466.777:523): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.29.66:22-139.178.68.195:50918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:17:46.777000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.29.66:22-139.178.68.195:50918 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:17:46.780383 systemd[1]: session-10.scope: Deactivated successfully.
Dec 16 12:17:46.780603 systemd[1]: session-10.scope: Consumed 6.674s CPU time, 221.9M memory peak.
Dec 16 12:17:46.781956 systemd-logind[1646]: Session 10 logged out. Waiting for processes to exit.
Dec 16 12:17:46.783485 systemd-logind[1646]: Removed session 10.
Dec 16 12:17:47.623000 audit[3343]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:47.623000 audit[3343]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc1eb1d30 a2=0 a3=1 items=0 ppid=3084 pid=3343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:47.630650 kernel: audit: type=1325 audit(1765887467.623:524): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:47.630724 kernel: audit: type=1300 audit(1765887467.623:524): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc1eb1d30 a2=0 a3=1 items=0 ppid=3084 pid=3343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:47.623000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:47.632593 kernel: audit: type=1327 audit(1765887467.623:524): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:47.634000 audit[3343]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:47.638845 kernel: audit: type=1325 audit(1765887467.634:525): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3343 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:47.634000 audit[3343]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc1eb1d30 a2=0 a3=1 items=0 ppid=3084 pid=3343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:47.634000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:47.645835 kernel: audit: type=1300 audit(1765887467.634:525): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc1eb1d30 a2=0 a3=1 items=0 ppid=3084 pid=3343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:48.651000 audit[3345]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3345 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:48.651000 audit[3345]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcb0a2d60 a2=0 a3=1 items=0 ppid=3084 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:48.651000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:48.659000 audit[3345]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3345 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:48.659000 audit[3345]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcb0a2d60 a2=0 a3=1 items=0 ppid=3084 pid=3345 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:48.659000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:51.639000 audit[3348]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3348 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:51.644072 kernel: kauditd_printk_skb: 7 callbacks suppressed
Dec 16 12:17:51.644230 kernel: audit: type=1325 audit(1765887471.639:528): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3348 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:51.639000 audit[3348]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffee83fc80 a2=0 a3=1 items=0 ppid=3084 pid=3348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:51.649137 kernel: audit: type=1300 audit(1765887471.639:528): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffee83fc80 a2=0 a3=1 items=0 ppid=3084 pid=3348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:51.639000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:51.652220 kernel: audit: type=1327 audit(1765887471.639:528): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:51.651000 audit[3348]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3348 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:51.651000 audit[3348]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffee83fc80 a2=0 a3=1 items=0 ppid=3084 pid=3348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:51.657949 kernel: audit: type=1325 audit(1765887471.651:529): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3348 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:51.658071 kernel: audit: type=1300 audit(1765887471.651:529): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffee83fc80 a2=0 a3=1 items=0 ppid=3084 pid=3348 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:51.658142 kernel: audit: type=1327 audit(1765887471.651:529): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:51.651000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:52.666000 audit[3351]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3351 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:52.666000 audit[3351]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff94aab00 a2=0 a3=1 items=0 ppid=3084 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:52.674128 kernel: audit: type=1325 audit(1765887472.666:530): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3351 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:52.674231 kernel: audit: type=1300 audit(1765887472.666:530): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffff94aab00 a2=0 a3=1 items=0 ppid=3084 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:52.666000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:52.675945 kernel: audit: type=1327 audit(1765887472.666:530): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:52.675000 audit[3351]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3351 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:52.675000 audit[3351]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff94aab00 a2=0 a3=1 items=0 ppid=3084 pid=3351 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:52.675000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:52.679848 kernel: audit: type=1325 audit(1765887472.675:531): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3351 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:53.694000 audit[3353]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3353 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:53.694000 audit[3353]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffed2319d0 a2=0 a3=1 items=0 ppid=3084 pid=3353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:53.694000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:53.700000 audit[3353]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3353 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:17:53.700000 audit[3353]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffed2319d0 a2=0 a3=1 items=0 ppid=3084 pid=3353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:17:53.700000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:17:54.222363 systemd[1]: Created slice kubepods-besteffort-poddbf2e4b4_1670_4579_ba1d_0f8d17dcf954.slice - libcontainer container kubepods-besteffort-poddbf2e4b4_1670_4579_ba1d_0f8d17dcf954.slice.
Dec 16 12:17:54.233058 kubelet[2932]: I1216 12:17:54.232798 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ccsjr\" (UniqueName: \"kubernetes.io/projected/dbf2e4b4-1670-4579-ba1d-0f8d17dcf954-kube-api-access-ccsjr\") pod \"calico-typha-6b58cfcf7f-6gjv5\" (UID: \"dbf2e4b4-1670-4579-ba1d-0f8d17dcf954\") " pod="calico-system/calico-typha-6b58cfcf7f-6gjv5"
Dec 16 12:17:54.233766 kubelet[2932]: I1216 12:17:54.233718 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/dbf2e4b4-1670-4579-ba1d-0f8d17dcf954-tigera-ca-bundle\") pod \"calico-typha-6b58cfcf7f-6gjv5\" (UID: \"dbf2e4b4-1670-4579-ba1d-0f8d17dcf954\") " pod="calico-system/calico-typha-6b58cfcf7f-6gjv5"
Dec 16 12:17:54.233912 kubelet[2932]: I1216 12:17:54.233804 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/dbf2e4b4-1670-4579-ba1d-0f8d17dcf954-typha-certs\") pod \"calico-typha-6b58cfcf7f-6gjv5\" (UID: \"dbf2e4b4-1670-4579-ba1d-0f8d17dcf954\") " pod="calico-system/calico-typha-6b58cfcf7f-6gjv5"
Dec 16 12:17:54.401685 systemd[1]: Created slice kubepods-besteffort-pod824e61dc_e283_4a90_ae46_ce4128bb9506.slice - libcontainer container kubepods-besteffort-pod824e61dc_e283_4a90_ae46_ce4128bb9506.slice.
Dec 16 12:17:54.434442 kubelet[2932]: I1216 12:17:54.434402 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-cni-log-dir\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.434442 kubelet[2932]: I1216 12:17:54.434444 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-var-run-calico\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.434610 kubelet[2932]: I1216 12:17:54.434461 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-xtables-lock\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435191 kubelet[2932]: I1216 12:17:54.434614 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-flexvol-driver-host\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435191 kubelet[2932]: I1216 12:17:54.434770 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-lib-modules\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435191 kubelet[2932]: I1216 12:17:54.434797 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-policysync\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435191 kubelet[2932]: I1216 12:17:54.435009 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/824e61dc-e283-4a90-ae46-ce4128bb9506-tigera-ca-bundle\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435191 kubelet[2932]: I1216 12:17:54.435033 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-var-lib-calico\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435446 kubelet[2932]: I1216 12:17:54.435047 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6hkf\" (UniqueName: \"kubernetes.io/projected/824e61dc-e283-4a90-ae46-ce4128bb9506-kube-api-access-d6hkf\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435446 kubelet[2932]: I1216 12:17:54.435101 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/824e61dc-e283-4a90-ae46-ce4128bb9506-node-certs\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435446 kubelet[2932]: I1216 12:17:54.435179 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-cni-bin-dir\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.435446 kubelet[2932]: I1216 12:17:54.435210 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/824e61dc-e283-4a90-ae46-ce4128bb9506-cni-net-dir\") pod \"calico-node-nmpjq\" (UID: \"824e61dc-e283-4a90-ae46-ce4128bb9506\") " pod="calico-system/calico-node-nmpjq"
Dec 16 12:17:54.529540 containerd[1662]: time="2025-12-16T12:17:54.529438818Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b58cfcf7f-6gjv5,Uid:dbf2e4b4-1670-4579-ba1d-0f8d17dcf954,Namespace:calico-system,Attempt:0,}"
Dec 16 12:17:54.537700 kubelet[2932]: E1216 12:17:54.537064 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.537700 kubelet[2932]: W1216 12:17:54.537202 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.537700 kubelet[2932]: E1216 12:17:54.537222 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.537700 kubelet[2932]: E1216 12:17:54.537645 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.537700 kubelet[2932]: W1216 12:17:54.537657 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.537700 kubelet[2932]: E1216 12:17:54.537695 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.538055 kubelet[2932]: E1216 12:17:54.538033 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.538055 kubelet[2932]: W1216 12:17:54.538050 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.538195 kubelet[2932]: E1216 12:17:54.538061 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.538636 kubelet[2932]: E1216 12:17:54.538553 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.538636 kubelet[2932]: W1216 12:17:54.538567 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.538636 kubelet[2932]: E1216 12:17:54.538577 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.539054 kubelet[2932]: E1216 12:17:54.539007 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.539054 kubelet[2932]: W1216 12:17:54.539024 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.539054 kubelet[2932]: E1216 12:17:54.539053 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.539788 kubelet[2932]: E1216 12:17:54.539748 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.539788 kubelet[2932]: W1216 12:17:54.539766 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.539870 kubelet[2932]: E1216 12:17:54.539797 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.540410 kubelet[2932]: E1216 12:17:54.540377 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.540410 kubelet[2932]: W1216 12:17:54.540395 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.540410 kubelet[2932]: E1216 12:17:54.540409 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.541243 kubelet[2932]: E1216 12:17:54.541143 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.541280 kubelet[2932]: W1216 12:17:54.541244 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.541280 kubelet[2932]: E1216 12:17:54.541256 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.541662 kubelet[2932]: E1216 12:17:54.541645 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.541699 kubelet[2932]: W1216 12:17:54.541681 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.541699 kubelet[2932]: E1216 12:17:54.541695 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.542350 kubelet[2932]: E1216 12:17:54.542290 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.542350 kubelet[2932]: W1216 12:17:54.542329 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.542408 kubelet[2932]: E1216 12:17:54.542359 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:17:54.542806 kubelet[2932]: E1216 12:17:54.542787 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:17:54.542852 kubelet[2932]: W1216 12:17:54.542804 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:17:54.543027 kubelet[2932]: E1216 12:17:54.543005 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping.
Error: unexpected end of JSON input" Dec 16 12:17:54.543384 kubelet[2932]: E1216 12:17:54.543367 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.543427 kubelet[2932]: W1216 12:17:54.543385 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.543427 kubelet[2932]: E1216 12:17:54.543396 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.543668 kubelet[2932]: E1216 12:17:54.543654 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.543668 kubelet[2932]: W1216 12:17:54.543667 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.543725 kubelet[2932]: E1216 12:17:54.543678 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.543843 kubelet[2932]: E1216 12:17:54.543830 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.543843 kubelet[2932]: W1216 12:17:54.543841 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.543897 kubelet[2932]: E1216 12:17:54.543849 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.544046 kubelet[2932]: E1216 12:17:54.544032 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544046 kubelet[2932]: W1216 12:17:54.544043 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.544101 kubelet[2932]: E1216 12:17:54.544051 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.544217 kubelet[2932]: E1216 12:17:54.544204 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544217 kubelet[2932]: W1216 12:17:54.544216 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.544268 kubelet[2932]: E1216 12:17:54.544225 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.544356 kubelet[2932]: E1216 12:17:54.544342 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544356 kubelet[2932]: W1216 12:17:54.544354 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.544406 kubelet[2932]: E1216 12:17:54.544362 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.544480 kubelet[2932]: E1216 12:17:54.544469 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544504 kubelet[2932]: W1216 12:17:54.544479 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.544504 kubelet[2932]: E1216 12:17:54.544487 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.544593 kubelet[2932]: E1216 12:17:54.544582 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544617 kubelet[2932]: W1216 12:17:54.544593 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.544617 kubelet[2932]: E1216 12:17:54.544601 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.544780 kubelet[2932]: E1216 12:17:54.544766 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544780 kubelet[2932]: W1216 12:17:54.544778 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.544848 kubelet[2932]: E1216 12:17:54.544786 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.544957 kubelet[2932]: E1216 12:17:54.544941 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.544957 kubelet[2932]: W1216 12:17:54.544956 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.545007 kubelet[2932]: E1216 12:17:54.544964 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.545163 kubelet[2932]: E1216 12:17:54.545149 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.545163 kubelet[2932]: W1216 12:17:54.545161 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.545213 kubelet[2932]: E1216 12:17:54.545170 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.545298 kubelet[2932]: E1216 12:17:54.545285 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.545298 kubelet[2932]: W1216 12:17:54.545296 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.545344 kubelet[2932]: E1216 12:17:54.545304 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.545453 kubelet[2932]: E1216 12:17:54.545442 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.545453 kubelet[2932]: W1216 12:17:54.545452 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.545503 kubelet[2932]: E1216 12:17:54.545459 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.545591 kubelet[2932]: E1216 12:17:54.545578 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.545591 kubelet[2932]: W1216 12:17:54.545589 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.545644 kubelet[2932]: E1216 12:17:54.545597 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.545740 kubelet[2932]: E1216 12:17:54.545728 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.545764 kubelet[2932]: W1216 12:17:54.545740 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.545764 kubelet[2932]: E1216 12:17:54.545748 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.546045 kubelet[2932]: E1216 12:17:54.546004 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.546045 kubelet[2932]: W1216 12:17:54.546029 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.546045 kubelet[2932]: E1216 12:17:54.546042 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.546221 kubelet[2932]: E1216 12:17:54.546208 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.546221 kubelet[2932]: W1216 12:17:54.546219 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.546426 kubelet[2932]: E1216 12:17:54.546227 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.546426 kubelet[2932]: E1216 12:17:54.546364 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.546426 kubelet[2932]: W1216 12:17:54.546371 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.546426 kubelet[2932]: E1216 12:17:54.546379 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.546527 kubelet[2932]: E1216 12:17:54.546520 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.546549 kubelet[2932]: W1216 12:17:54.546527 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.546549 kubelet[2932]: E1216 12:17:54.546535 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.550115 kubelet[2932]: E1216 12:17:54.550052 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.550115 kubelet[2932]: W1216 12:17:54.550075 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.550115 kubelet[2932]: E1216 12:17:54.550088 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.552852 kubelet[2932]: E1216 12:17:54.552660 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.552852 kubelet[2932]: W1216 12:17:54.552681 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.552852 kubelet[2932]: E1216 12:17:54.552697 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.553255 containerd[1662]: time="2025-12-16T12:17:54.553208174Z" level=info msg="connecting to shim 985c3c6ae3f16d3b9bca5f2f6f158a8228d253b0e37ca23c7325df3b36573541" address="unix:///run/containerd/s/f297a0637e031ccddbba8ed4c45a3f4806ea7ff8220062ed213c70f30817cfed" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:17:54.581081 systemd[1]: Started cri-containerd-985c3c6ae3f16d3b9bca5f2f6f158a8228d253b0e37ca23c7325df3b36573541.scope - libcontainer container 985c3c6ae3f16d3b9bca5f2f6f158a8228d253b0e37ca23c7325df3b36573541. 
Dec 16 12:17:54.597185 kubelet[2932]: E1216 12:17:54.597139 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:17:54.623936 kubelet[2932]: E1216 12:17:54.623893 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.623936 kubelet[2932]: W1216 12:17:54.623924 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.623936 kubelet[2932]: E1216 12:17:54.623946 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.624204 kubelet[2932]: E1216 12:17:54.624179 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.624204 kubelet[2932]: W1216 12:17:54.624195 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.624269 kubelet[2932]: E1216 12:17:54.624206 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.624373 kubelet[2932]: E1216 12:17:54.624353 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.624373 kubelet[2932]: W1216 12:17:54.624367 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.624426 kubelet[2932]: E1216 12:17:54.624377 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.624541 kubelet[2932]: E1216 12:17:54.624522 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.624541 kubelet[2932]: W1216 12:17:54.624536 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.624598 kubelet[2932]: E1216 12:17:54.624544 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.624771 kubelet[2932]: E1216 12:17:54.624736 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.624771 kubelet[2932]: W1216 12:17:54.624750 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.624771 kubelet[2932]: E1216 12:17:54.624759 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.625655 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.626955 kubelet[2932]: W1216 12:17:54.625664 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.625673 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.625838 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.626955 kubelet[2932]: W1216 12:17:54.625846 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.625854 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.626002 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.626955 kubelet[2932]: W1216 12:17:54.626014 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.626022 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.626955 kubelet[2932]: E1216 12:17:54.626213 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627167 kubelet[2932]: W1216 12:17:54.626221 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627167 kubelet[2932]: E1216 12:17:54.626229 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.627167 kubelet[2932]: E1216 12:17:54.626368 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627167 kubelet[2932]: W1216 12:17:54.626375 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627167 kubelet[2932]: E1216 12:17:54.626383 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.627167 kubelet[2932]: E1216 12:17:54.626513 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627167 kubelet[2932]: W1216 12:17:54.626520 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627167 kubelet[2932]: E1216 12:17:54.626527 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.627167 kubelet[2932]: E1216 12:17:54.626643 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627167 kubelet[2932]: W1216 12:17:54.626660 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.626668 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.626802 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627395 kubelet[2932]: W1216 12:17:54.626899 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.626915 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.627079 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627395 kubelet[2932]: W1216 12:17:54.627089 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.627098 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.627267 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627395 kubelet[2932]: W1216 12:17:54.627275 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627395 kubelet[2932]: E1216 12:17:54.627284 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.627578 kubelet[2932]: E1216 12:17:54.627431 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627578 kubelet[2932]: W1216 12:17:54.627439 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627578 kubelet[2932]: E1216 12:17:54.627446 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.627635 kubelet[2932]: E1216 12:17:54.627622 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627635 kubelet[2932]: W1216 12:17:54.627630 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.627713 kubelet[2932]: E1216 12:17:54.627638 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.627997 kubelet[2932]: E1216 12:17:54.627981 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.627997 kubelet[2932]: W1216 12:17:54.627993 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.628081 kubelet[2932]: E1216 12:17:54.628004 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.628173 kubelet[2932]: E1216 12:17:54.628160 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.628203 kubelet[2932]: W1216 12:17:54.628175 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.628203 kubelet[2932]: E1216 12:17:54.628184 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.628589 kubelet[2932]: E1216 12:17:54.628569 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.628589 kubelet[2932]: W1216 12:17:54.628584 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.628673 kubelet[2932]: E1216 12:17:54.628619 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.638840 kubelet[2932]: E1216 12:17:54.637255 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.638840 kubelet[2932]: W1216 12:17:54.637278 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.638840 kubelet[2932]: E1216 12:17:54.637295 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.638840 kubelet[2932]: I1216 12:17:54.637322 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kxzwz\" (UniqueName: \"kubernetes.io/projected/1d6077b5-49a5-4d21-bd6b-0ffae41c4da0-kube-api-access-kxzwz\") pod \"csi-node-driver-tp52x\" (UID: \"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0\") " pod="calico-system/csi-node-driver-tp52x" Dec 16 12:17:54.639067 kubelet[2932]: E1216 12:17:54.638911 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.639067 kubelet[2932]: W1216 12:17:54.638927 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.639067 kubelet[2932]: E1216 12:17:54.638942 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.639067 kubelet[2932]: I1216 12:17:54.638968 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/1d6077b5-49a5-4d21-bd6b-0ffae41c4da0-registration-dir\") pod \"csi-node-driver-tp52x\" (UID: \"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0\") " pod="calico-system/csi-node-driver-tp52x" Dec 16 12:17:54.639192 kubelet[2932]: E1216 12:17:54.639148 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.639192 kubelet[2932]: W1216 12:17:54.639189 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.639242 kubelet[2932]: E1216 12:17:54.639201 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.639302 kubelet[2932]: I1216 12:17:54.639280 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/1d6077b5-49a5-4d21-bd6b-0ffae41c4da0-socket-dir\") pod \"csi-node-driver-tp52x\" (UID: \"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0\") " pod="calico-system/csi-node-driver-tp52x" Dec 16 12:17:54.639430 kubelet[2932]: E1216 12:17:54.639412 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.639430 kubelet[2932]: W1216 12:17:54.639424 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.637000 audit: BPF prog-id=151 op=LOAD Dec 16 12:17:54.639550 kubelet[2932]: E1216 12:17:54.639433 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.639774 kubelet[2932]: E1216 12:17:54.639740 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.639774 kubelet[2932]: W1216 12:17:54.639756 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.639774 kubelet[2932]: E1216 12:17:54.639768 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.638000 audit: BPF prog-id=152 op=LOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.638000 audit: BPF prog-id=152 op=UNLOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.638000 audit: BPF prog-id=153 op=LOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.638000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.638000 audit: BPF prog-id=154 op=LOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.638000 audit: BPF prog-id=154 op=UNLOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.638000 audit: BPF prog-id=153 op=UNLOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:17:54.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.638000 audit: BPF prog-id=155 op=LOAD Dec 16 12:17:54.638000 audit[3409]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3397 pid=3409 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.638000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3938356333633661653366313664336239626361356632663666313538 Dec 16 12:17:54.640735 kubelet[2932]: E1216 12:17:54.639986 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.640735 kubelet[2932]: W1216 12:17:54.639996 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.640735 kubelet[2932]: E1216 12:17:54.640005 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.640735 kubelet[2932]: I1216 12:17:54.640032 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/1d6077b5-49a5-4d21-bd6b-0ffae41c4da0-varrun\") pod \"csi-node-driver-tp52x\" (UID: \"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0\") " pod="calico-system/csi-node-driver-tp52x" Dec 16 12:17:54.640735 kubelet[2932]: E1216 12:17:54.640212 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.640735 kubelet[2932]: W1216 12:17:54.640221 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.640735 kubelet[2932]: E1216 12:17:54.640230 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.640735 kubelet[2932]: E1216 12:17:54.640401 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.640735 kubelet[2932]: W1216 12:17:54.640410 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.641003 kubelet[2932]: E1216 12:17:54.640418 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.641003 kubelet[2932]: E1216 12:17:54.640960 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.641003 kubelet[2932]: W1216 12:17:54.640973 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.641003 kubelet[2932]: E1216 12:17:54.640985 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.642211 kubelet[2932]: E1216 12:17:54.641156 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.642211 kubelet[2932]: W1216 12:17:54.641170 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.642211 kubelet[2932]: E1216 12:17:54.641179 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.642211 kubelet[2932]: E1216 12:17:54.641449 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.642211 kubelet[2932]: W1216 12:17:54.641463 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.642211 kubelet[2932]: E1216 12:17:54.641476 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.642211 kubelet[2932]: E1216 12:17:54.641768 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.642211 kubelet[2932]: W1216 12:17:54.641780 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.642211 kubelet[2932]: E1216 12:17:54.641790 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.642446 kubelet[2932]: I1216 12:17:54.641951 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/1d6077b5-49a5-4d21-bd6b-0ffae41c4da0-kubelet-dir\") pod \"csi-node-driver-tp52x\" (UID: \"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0\") " pod="calico-system/csi-node-driver-tp52x" Dec 16 12:17:54.642446 kubelet[2932]: E1216 12:17:54.642050 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.642446 kubelet[2932]: W1216 12:17:54.642058 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.642446 kubelet[2932]: E1216 12:17:54.642067 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.644041 kubelet[2932]: E1216 12:17:54.644008 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.644041 kubelet[2932]: W1216 12:17:54.644027 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.644041 kubelet[2932]: E1216 12:17:54.644044 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.644293 kubelet[2932]: E1216 12:17:54.644267 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.644293 kubelet[2932]: W1216 12:17:54.644280 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.644293 kubelet[2932]: E1216 12:17:54.644291 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.674335 containerd[1662]: time="2025-12-16T12:17:54.674295202Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-6b58cfcf7f-6gjv5,Uid:dbf2e4b4-1670-4579-ba1d-0f8d17dcf954,Namespace:calico-system,Attempt:0,} returns sandbox id \"985c3c6ae3f16d3b9bca5f2f6f158a8228d253b0e37ca23c7325df3b36573541\"" Dec 16 12:17:54.676430 containerd[1662]: time="2025-12-16T12:17:54.676262848Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:17:54.706746 containerd[1662]: time="2025-12-16T12:17:54.706707346Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nmpjq,Uid:824e61dc-e283-4a90-ae46-ce4128bb9506,Namespace:calico-system,Attempt:0,}" Dec 16 12:17:54.719000 audit[3482]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3482 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:17:54.719000 audit[3482]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff3878b90 a2=0 a3=1 items=0 ppid=3084 pid=3482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.719000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:17:54.730000 audit[3482]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3482 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:17:54.730000 audit[3482]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff3878b90 a2=0 a3=1 items=0 ppid=3084 pid=3482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.730000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:17:54.736687 containerd[1662]: time="2025-12-16T12:17:54.736619082Z" level=info msg="connecting to shim 4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a" address="unix:///run/containerd/s/0ac6c768bdd4270cb5a4414d39a56532d892357106adac15928f12a56512b775" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:17:54.742405 kubelet[2932]: E1216 12:17:54.742360 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.742405 kubelet[2932]: W1216 12:17:54.742388 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.742596 kubelet[2932]: E1216 12:17:54.742408 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.743349 kubelet[2932]: E1216 12:17:54.743276 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.743654 kubelet[2932]: W1216 12:17:54.743419 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.743654 kubelet[2932]: E1216 12:17:54.743436 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.743877 kubelet[2932]: E1216 12:17:54.743861 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.744171 kubelet[2932]: W1216 12:17:54.743967 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.744284 kubelet[2932]: E1216 12:17:54.744268 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.745244 kubelet[2932]: E1216 12:17:54.745227 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.745732 kubelet[2932]: W1216 12:17:54.745592 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.745732 kubelet[2932]: E1216 12:17:54.745618 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.746476 kubelet[2932]: E1216 12:17:54.745915 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.746568 kubelet[2932]: W1216 12:17:54.746551 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.746637 kubelet[2932]: E1216 12:17:54.746625 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.746984 kubelet[2932]: E1216 12:17:54.746914 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.746984 kubelet[2932]: W1216 12:17:54.746928 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.746984 kubelet[2932]: E1216 12:17:54.746939 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.747479 kubelet[2932]: E1216 12:17:54.747460 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.747479 kubelet[2932]: W1216 12:17:54.747479 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.747536 kubelet[2932]: E1216 12:17:54.747492 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.747720 kubelet[2932]: E1216 12:17:54.747707 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.747720 kubelet[2932]: W1216 12:17:54.747719 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.747889 kubelet[2932]: E1216 12:17:54.747730 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.748187 kubelet[2932]: E1216 12:17:54.748122 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.748187 kubelet[2932]: W1216 12:17:54.748133 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.748187 kubelet[2932]: E1216 12:17:54.748142 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.748901 kubelet[2932]: E1216 12:17:54.748878 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.748901 kubelet[2932]: W1216 12:17:54.748899 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.748967 kubelet[2932]: E1216 12:17:54.748911 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.749549 kubelet[2932]: E1216 12:17:54.749529 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.749549 kubelet[2932]: W1216 12:17:54.749545 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.749631 kubelet[2932]: E1216 12:17:54.749557 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.749792 kubelet[2932]: E1216 12:17:54.749764 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.749792 kubelet[2932]: W1216 12:17:54.749782 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.749792 kubelet[2932]: E1216 12:17:54.749795 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.750375 kubelet[2932]: E1216 12:17:54.750355 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.750375 kubelet[2932]: W1216 12:17:54.750367 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.750424 kubelet[2932]: E1216 12:17:54.750378 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.750805 kubelet[2932]: E1216 12:17:54.750727 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.750805 kubelet[2932]: W1216 12:17:54.750744 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.750805 kubelet[2932]: E1216 12:17:54.750760 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.751912 kubelet[2932]: E1216 12:17:54.751623 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.751912 kubelet[2932]: W1216 12:17:54.751644 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.751912 kubelet[2932]: E1216 12:17:54.751658 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.752443 kubelet[2932]: E1216 12:17:54.752349 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.752443 kubelet[2932]: W1216 12:17:54.752365 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.752443 kubelet[2932]: E1216 12:17:54.752379 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.753858 kubelet[2932]: E1216 12:17:54.752961 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.753858 kubelet[2932]: W1216 12:17:54.752978 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.753858 kubelet[2932]: E1216 12:17:54.752992 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.753858 kubelet[2932]: E1216 12:17:54.753287 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.753858 kubelet[2932]: W1216 12:17:54.753299 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.753858 kubelet[2932]: E1216 12:17:54.753309 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.753858 kubelet[2932]: E1216 12:17:54.753573 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.753858 kubelet[2932]: W1216 12:17:54.753586 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.753858 kubelet[2932]: E1216 12:17:54.753596 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.755922 kubelet[2932]: E1216 12:17:54.755875 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.755922 kubelet[2932]: W1216 12:17:54.755898 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.756572 kubelet[2932]: E1216 12:17:54.755934 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.756572 kubelet[2932]: E1216 12:17:54.756556 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.756572 kubelet[2932]: W1216 12:17:54.756571 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.756678 kubelet[2932]: E1216 12:17:54.756584 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.758885 kubelet[2932]: E1216 12:17:54.758855 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.758885 kubelet[2932]: W1216 12:17:54.758877 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.758976 kubelet[2932]: E1216 12:17:54.758894 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.759849 kubelet[2932]: E1216 12:17:54.759653 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.759849 kubelet[2932]: W1216 12:17:54.759671 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.759849 kubelet[2932]: E1216 12:17:54.759685 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.760166 kubelet[2932]: E1216 12:17:54.760016 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.760166 kubelet[2932]: W1216 12:17:54.760031 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.760166 kubelet[2932]: E1216 12:17:54.760042 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.760310 kubelet[2932]: E1216 12:17:54.760298 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.760361 kubelet[2932]: W1216 12:17:54.760350 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.760410 kubelet[2932]: E1216 12:17:54.760400 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:54.765008 kubelet[2932]: E1216 12:17:54.764986 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:54.765008 kubelet[2932]: W1216 12:17:54.765004 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:54.765104 kubelet[2932]: E1216 12:17:54.765019 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:54.770995 systemd[1]: Started cri-containerd-4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a.scope - libcontainer container 4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a. Dec 16 12:17:54.778000 audit: BPF prog-id=156 op=LOAD Dec 16 12:17:54.778000 audit: BPF prog-id=157 op=LOAD Dec 16 12:17:54.778000 audit[3505]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.778000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.778000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:17:54.778000 audit[3505]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.778000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.779000 audit: BPF prog-id=158 op=LOAD Dec 16 12:17:54.779000 audit[3505]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.779000 audit: BPF prog-id=159 op=LOAD Dec 16 12:17:54.779000 audit[3505]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.779000 audit: BPF prog-id=159 op=UNLOAD Dec 16 12:17:54.779000 audit[3505]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.779000 audit: BPF prog-id=158 op=UNLOAD Dec 16 12:17:54.779000 audit[3505]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.779000 audit: BPF prog-id=160 op=LOAD Dec 16 12:17:54.779000 audit[3505]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3491 pid=3505 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:54.779000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436343238373964303764633339656131333762646332633366383939 Dec 16 12:17:54.796377 containerd[1662]: time="2025-12-16T12:17:54.796341273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-nmpjq,Uid:824e61dc-e283-4a90-ae46-ce4128bb9506,Namespace:calico-system,Attempt:0,} returns 
sandbox id \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\"" Dec 16 12:17:55.942679 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3476528903.mount: Deactivated successfully. Dec 16 12:17:56.109330 kubelet[2932]: E1216 12:17:56.109276 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:17:56.352193 containerd[1662]: time="2025-12-16T12:17:56.352079016Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:56.353715 containerd[1662]: time="2025-12-16T12:17:56.353537021Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Dec 16 12:17:56.354654 containerd[1662]: time="2025-12-16T12:17:56.354613544Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:56.357504 containerd[1662]: time="2025-12-16T12:17:56.357242593Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:56.358605 containerd[1662]: time="2025-12-16T12:17:56.358573317Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 1.682271988s" Dec 16 
12:17:56.358730 containerd[1662]: time="2025-12-16T12:17:56.358715317Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 12:17:56.360255 containerd[1662]: time="2025-12-16T12:17:56.359922321Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:17:56.373501 containerd[1662]: time="2025-12-16T12:17:56.373047363Z" level=info msg="CreateContainer within sandbox \"985c3c6ae3f16d3b9bca5f2f6f158a8228d253b0e37ca23c7325df3b36573541\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:17:56.384974 containerd[1662]: time="2025-12-16T12:17:56.384935681Z" level=info msg="Container f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:17:56.389567 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4294309411.mount: Deactivated successfully. Dec 16 12:17:56.395600 containerd[1662]: time="2025-12-16T12:17:56.395527755Z" level=info msg="CreateContainer within sandbox \"985c3c6ae3f16d3b9bca5f2f6f158a8228d253b0e37ca23c7325df3b36573541\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b\"" Dec 16 12:17:56.396126 containerd[1662]: time="2025-12-16T12:17:56.396080117Z" level=info msg="StartContainer for \"f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b\"" Dec 16 12:17:56.397528 containerd[1662]: time="2025-12-16T12:17:56.397473442Z" level=info msg="connecting to shim f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b" address="unix:///run/containerd/s/f297a0637e031ccddbba8ed4c45a3f4806ea7ff8220062ed213c70f30817cfed" protocol=ttrpc version=3 Dec 16 12:17:56.421037 systemd[1]: Started cri-containerd-f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b.scope - libcontainer container 
f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b. Dec 16 12:17:56.430000 audit: BPF prog-id=161 op=LOAD Dec 16 12:17:56.431000 audit: BPF prog-id=162 op=LOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:56.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.431000 audit: BPF prog-id=162 op=UNLOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:56.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.431000 audit: BPF prog-id=163 op=LOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:56.431000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.431000 audit: BPF prog-id=164 op=LOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:56.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.431000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:56.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.431000 audit: BPF prog-id=163 op=UNLOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:17:56.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.431000 audit: BPF prog-id=165 op=LOAD Dec 16 12:17:56.431000 audit[3565]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3397 pid=3565 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:56.431000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6631316161663564396361313732653365346432666433373062323337 Dec 16 12:17:56.457541 containerd[1662]: time="2025-12-16T12:17:56.456749591Z" level=info msg="StartContainer for \"f11aaf5d9ca172e3e4d2fd370b237068442441ed4d49c595d27cc0e1fd7b967b\" returns successfully" Dec 16 12:17:57.190588 kubelet[2932]: I1216 12:17:57.190526 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-6b58cfcf7f-6gjv5" podStartSLOduration=1.5066459079999999 podStartE2EDuration="3.190510022s" podCreationTimestamp="2025-12-16 12:17:54 +0000 UTC" firstStartedPulling="2025-12-16 12:17:54.675844687 +0000 UTC m=+22.726759798" lastFinishedPulling="2025-12-16 12:17:56.359708801 +0000 UTC m=+24.410623912" observedRunningTime="2025-12-16 12:17:57.190390261 +0000 UTC m=+25.241305332" watchObservedRunningTime="2025-12-16 12:17:57.190510022 +0000 UTC m=+25.241425133" Dec 16 12:17:57.244805 kubelet[2932]: E1216 12:17:57.244764 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input 
Dec 16 12:17:57.244805 kubelet[2932]: W1216 12:17:57.244789 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.244805 kubelet[2932]: E1216 12:17:57.244832 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.244986 kubelet[2932]: E1216 12:17:57.244979 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245011 kubelet[2932]: W1216 12:17:57.244987 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245011 kubelet[2932]: E1216 12:17:57.244995 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.245153 kubelet[2932]: E1216 12:17:57.245121 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245153 kubelet[2932]: W1216 12:17:57.245131 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245153 kubelet[2932]: E1216 12:17:57.245140 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.245280 kubelet[2932]: E1216 12:17:57.245270 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245280 kubelet[2932]: W1216 12:17:57.245279 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245331 kubelet[2932]: E1216 12:17:57.245287 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.245428 kubelet[2932]: E1216 12:17:57.245418 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245428 kubelet[2932]: W1216 12:17:57.245427 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245473 kubelet[2932]: E1216 12:17:57.245434 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.245557 kubelet[2932]: E1216 12:17:57.245547 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245583 kubelet[2932]: W1216 12:17:57.245556 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245583 kubelet[2932]: E1216 12:17:57.245563 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.245699 kubelet[2932]: E1216 12:17:57.245689 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245724 kubelet[2932]: W1216 12:17:57.245698 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245724 kubelet[2932]: E1216 12:17:57.245706 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.245847 kubelet[2932]: E1216 12:17:57.245837 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.245878 kubelet[2932]: W1216 12:17:57.245847 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.245878 kubelet[2932]: E1216 12:17:57.245855 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.245989 kubelet[2932]: E1216 12:17:57.245978 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246011 kubelet[2932]: W1216 12:17:57.245988 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246011 kubelet[2932]: E1216 12:17:57.245995 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.246182 kubelet[2932]: E1216 12:17:57.246157 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246182 kubelet[2932]: W1216 12:17:57.246168 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246182 kubelet[2932]: E1216 12:17:57.246175 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.246316 kubelet[2932]: E1216 12:17:57.246306 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246316 kubelet[2932]: W1216 12:17:57.246316 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246358 kubelet[2932]: E1216 12:17:57.246328 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.246455 kubelet[2932]: E1216 12:17:57.246445 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246480 kubelet[2932]: W1216 12:17:57.246454 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246480 kubelet[2932]: E1216 12:17:57.246462 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.246598 kubelet[2932]: E1216 12:17:57.246588 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246620 kubelet[2932]: W1216 12:17:57.246597 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246620 kubelet[2932]: E1216 12:17:57.246604 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.246732 kubelet[2932]: E1216 12:17:57.246723 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246732 kubelet[2932]: W1216 12:17:57.246731 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246778 kubelet[2932]: E1216 12:17:57.246738 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.246917 kubelet[2932]: E1216 12:17:57.246906 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.246938 kubelet[2932]: W1216 12:17:57.246916 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.246938 kubelet[2932]: E1216 12:17:57.246924 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.267525 kubelet[2932]: E1216 12:17:57.267488 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.267525 kubelet[2932]: W1216 12:17:57.267508 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.267525 kubelet[2932]: E1216 12:17:57.267531 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.267784 kubelet[2932]: E1216 12:17:57.267769 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.267784 kubelet[2932]: W1216 12:17:57.267780 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.267861 kubelet[2932]: E1216 12:17:57.267790 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.268019 kubelet[2932]: E1216 12:17:57.268002 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.268065 kubelet[2932]: W1216 12:17:57.268019 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.268065 kubelet[2932]: E1216 12:17:57.268031 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.268199 kubelet[2932]: E1216 12:17:57.268189 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.268199 kubelet[2932]: W1216 12:17:57.268199 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.268257 kubelet[2932]: E1216 12:17:57.268207 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.268342 kubelet[2932]: E1216 12:17:57.268330 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.268342 kubelet[2932]: W1216 12:17:57.268340 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.268408 kubelet[2932]: E1216 12:17:57.268348 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.268506 kubelet[2932]: E1216 12:17:57.268495 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.268539 kubelet[2932]: W1216 12:17:57.268505 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.268539 kubelet[2932]: E1216 12:17:57.268513 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.268798 kubelet[2932]: E1216 12:17:57.268782 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.268902 kubelet[2932]: W1216 12:17:57.268888 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.268971 kubelet[2932]: E1216 12:17:57.268958 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.269213 kubelet[2932]: E1216 12:17:57.269201 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.269402 kubelet[2932]: W1216 12:17:57.269277 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.269402 kubelet[2932]: E1216 12:17:57.269293 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.269538 kubelet[2932]: E1216 12:17:57.269526 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.269596 kubelet[2932]: W1216 12:17:57.269584 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.269650 kubelet[2932]: E1216 12:17:57.269640 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.270114 kubelet[2932]: E1216 12:17:57.270080 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.270114 kubelet[2932]: W1216 12:17:57.270102 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.270114 kubelet[2932]: E1216 12:17:57.270114 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.270316 kubelet[2932]: E1216 12:17:57.270295 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.270365 kubelet[2932]: W1216 12:17:57.270318 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.270365 kubelet[2932]: E1216 12:17:57.270332 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.270478 kubelet[2932]: E1216 12:17:57.270460 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.270478 kubelet[2932]: W1216 12:17:57.270471 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.270527 kubelet[2932]: E1216 12:17:57.270483 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.270634 kubelet[2932]: E1216 12:17:57.270618 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.270675 kubelet[2932]: W1216 12:17:57.270636 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.270675 kubelet[2932]: E1216 12:17:57.270650 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.273529 kubelet[2932]: E1216 12:17:57.273490 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.273653 kubelet[2932]: W1216 12:17:57.273614 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.273976 kubelet[2932]: E1216 12:17:57.273853 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.274095 kubelet[2932]: E1216 12:17:57.274083 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.274169 kubelet[2932]: W1216 12:17:57.274157 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.274223 kubelet[2932]: E1216 12:17:57.274213 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.274553 kubelet[2932]: E1216 12:17:57.274420 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.274553 kubelet[2932]: W1216 12:17:57.274430 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.274553 kubelet[2932]: E1216 12:17:57.274440 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.274707 kubelet[2932]: E1216 12:17:57.274696 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.274766 kubelet[2932]: W1216 12:17:57.274755 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.274841 kubelet[2932]: E1216 12:17:57.274804 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:17:57.275318 kubelet[2932]: E1216 12:17:57.275304 2932 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:17:57.275404 kubelet[2932]: W1216 12:17:57.275391 2932 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:17:57.275466 kubelet[2932]: E1216 12:17:57.275456 2932 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:17:57.602168 containerd[1662]: time="2025-12-16T12:17:57.601954260Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:57.603855 containerd[1662]: time="2025-12-16T12:17:57.603767585Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:57.605688 containerd[1662]: time="2025-12-16T12:17:57.605655591Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:57.608706 containerd[1662]: time="2025-12-16T12:17:57.608669721Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:17:57.609676 containerd[1662]: time="2025-12-16T12:17:57.609642204Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.249686443s" Dec 16 12:17:57.609676 containerd[1662]: time="2025-12-16T12:17:57.609678444Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 12:17:57.615087 containerd[1662]: time="2025-12-16T12:17:57.615050661Z" level=info msg="CreateContainer within sandbox \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:17:57.627503 containerd[1662]: time="2025-12-16T12:17:57.626276937Z" level=info msg="Container 4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:17:57.629547 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount610193133.mount: Deactivated successfully. Dec 16 12:17:57.637932 containerd[1662]: time="2025-12-16T12:17:57.637892055Z" level=info msg="CreateContainer within sandbox \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf\"" Dec 16 12:17:57.638448 containerd[1662]: time="2025-12-16T12:17:57.638352856Z" level=info msg="StartContainer for \"4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf\"" Dec 16 12:17:57.642177 containerd[1662]: time="2025-12-16T12:17:57.642127348Z" level=info msg="connecting to shim 4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf" address="unix:///run/containerd/s/0ac6c768bdd4270cb5a4414d39a56532d892357106adac15928f12a56512b775" protocol=ttrpc version=3 Dec 16 12:17:57.665005 systemd[1]: Started cri-containerd-4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf.scope - libcontainer container 4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf. 
Dec 16 12:17:57.704000 audit: BPF prog-id=166 op=LOAD Dec 16 12:17:57.706978 kernel: kauditd_printk_skb: 80 callbacks suppressed Dec 16 12:17:57.707006 kernel: audit: type=1334 audit(1765887477.704:560): prog-id=166 op=LOAD Dec 16 12:17:57.704000 audit[3643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.710638 kernel: audit: type=1300 audit(1765887477.704:560): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.714842 kernel: audit: type=1327 audit(1765887477.704:560): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.704000 audit: BPF prog-id=167 op=LOAD Dec 16 12:17:57.715835 kernel: audit: type=1334 audit(1765887477.704:561): prog-id=167 op=LOAD Dec 16 12:17:57.715897 kernel: audit: type=1300 audit(1765887477.704:561): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.704000 audit[3643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.704000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.722114 kernel: audit: type=1327 audit(1765887477.704:561): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.722157 kernel: audit: type=1334 audit(1765887477.705:562): prog-id=167 op=UNLOAD Dec 16 12:17:57.705000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:17:57.705000 audit[3643]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.726583 kernel: audit: type=1300 audit(1765887477.705:562): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.705000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.729819 kernel: audit: type=1327 audit(1765887477.705:562): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.705000 audit: BPF prog-id=166 op=UNLOAD Dec 16 12:17:57.730934 kernel: audit: type=1334 audit(1765887477.705:563): prog-id=166 op=UNLOAD Dec 16 12:17:57.705000 audit[3643]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.705000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.705000 audit: BPF prog-id=168 op=LOAD Dec 16 12:17:57.705000 audit[3643]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3491 pid=3643 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:57.705000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3463626636396433343230666138323864633131383365663430346535 Dec 16 12:17:57.744664 containerd[1662]: time="2025-12-16T12:17:57.744609756Z" level=info msg="StartContainer for \"4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf\" returns successfully" Dec 16 12:17:57.794417 systemd[1]: cri-containerd-4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf.scope: Deactivated successfully. Dec 16 12:17:57.796000 audit: BPF prog-id=168 op=UNLOAD Dec 16 12:17:57.798783 containerd[1662]: time="2025-12-16T12:17:57.798748210Z" level=info msg="received container exit event container_id:\"4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf\" id:\"4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf\" pid:3656 exited_at:{seconds:1765887477 nanos:798380009}" Dec 16 12:17:57.820213 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4cbf69d3420fa828dc1183ef404e599a3e78ce85a87b9fa6addb6f402ae4e6bf-rootfs.mount: Deactivated successfully. 
Dec 16 12:17:58.109611 kubelet[2932]: E1216 12:17:58.109551 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:17:58.183241 kubelet[2932]: I1216 12:17:58.183127 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:18:00.111394 kubelet[2932]: E1216 12:18:00.111351 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:01.192485 containerd[1662]: time="2025-12-16T12:18:01.192441600Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:18:02.110277 kubelet[2932]: E1216 12:18:02.110135 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:03.633104 containerd[1662]: time="2025-12-16T12:18:03.632894217Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:03.635041 containerd[1662]: time="2025-12-16T12:18:03.634997863Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Dec 16 12:18:03.635995 containerd[1662]: time="2025-12-16T12:18:03.635967867Z" level=info msg="ImageCreate event 
name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:03.639081 containerd[1662]: time="2025-12-16T12:18:03.639031756Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:03.639836 containerd[1662]: time="2025-12-16T12:18:03.639560878Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 2.447078078s" Dec 16 12:18:03.639836 containerd[1662]: time="2025-12-16T12:18:03.639609478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 16 12:18:03.643831 containerd[1662]: time="2025-12-16T12:18:03.643687411Z" level=info msg="CreateContainer within sandbox \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:18:03.651347 containerd[1662]: time="2025-12-16T12:18:03.651315196Z" level=info msg="Container 2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:03.659314 containerd[1662]: time="2025-12-16T12:18:03.659277021Z" level=info msg="CreateContainer within sandbox \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f\"" Dec 16 12:18:03.661089 containerd[1662]: time="2025-12-16T12:18:03.660947707Z" 
level=info msg="StartContainer for \"2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f\"" Dec 16 12:18:03.662627 containerd[1662]: time="2025-12-16T12:18:03.662601552Z" level=info msg="connecting to shim 2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f" address="unix:///run/containerd/s/0ac6c768bdd4270cb5a4414d39a56532d892357106adac15928f12a56512b775" protocol=ttrpc version=3 Dec 16 12:18:03.683012 systemd[1]: Started cri-containerd-2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f.scope - libcontainer container 2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f. Dec 16 12:18:03.729000 audit: BPF prog-id=169 op=LOAD Dec 16 12:18:03.732349 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:18:03.732405 kernel: audit: type=1334 audit(1765887483.729:566): prog-id=169 op=LOAD Dec 16 12:18:03.729000 audit[3706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.736059 kernel: audit: type=1300 audit(1765887483.729:566): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.736185 kernel: audit: type=1327 audit(1765887483.729:566): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.729000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.729000 audit: BPF prog-id=170 op=LOAD Dec 16 12:18:03.739847 kernel: audit: type=1334 audit(1765887483.729:567): prog-id=170 op=LOAD Dec 16 12:18:03.729000 audit[3706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.743427 kernel: audit: type=1300 audit(1765887483.729:567): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.729000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.746703 kernel: audit: type=1327 audit(1765887483.729:567): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.746783 kernel: audit: type=1334 audit(1765887483.731:568): prog-id=170 op=UNLOAD Dec 16 12:18:03.731000 audit: BPF prog-id=170 op=UNLOAD Dec 16 12:18:03.731000 audit[3706]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 
a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.750729 kernel: audit: type=1300 audit(1765887483.731:568): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.731000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.754862 kernel: audit: type=1327 audit(1765887483.731:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.754929 kernel: audit: type=1334 audit(1765887483.731:569): prog-id=169 op=UNLOAD Dec 16 12:18:03.731000 audit: BPF prog-id=169 op=UNLOAD Dec 16 12:18:03.731000 audit[3706]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.731000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.731000 
audit: BPF prog-id=171 op=LOAD Dec 16 12:18:03.731000 audit[3706]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=3491 pid=3706 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:03.731000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3266666166333265393835623365613661363763306432333437363361 Dec 16 12:18:03.767720 containerd[1662]: time="2025-12-16T12:18:03.767678208Z" level=info msg="StartContainer for \"2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f\" returns successfully" Dec 16 12:18:04.110740 kubelet[2932]: E1216 12:18:04.110585 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:05.024395 containerd[1662]: time="2025-12-16T12:18:05.024347634Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:18:05.026466 systemd[1]: cri-containerd-2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f.scope: Deactivated successfully. Dec 16 12:18:05.027119 systemd[1]: cri-containerd-2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f.scope: Consumed 458ms CPU time, 185.2M memory peak, 165.9M written to disk. 
Dec 16 12:18:05.028374 containerd[1662]: time="2025-12-16T12:18:05.028308846Z" level=info msg="received container exit event container_id:\"2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f\" id:\"2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f\" pid:3718 exited_at:{seconds:1765887485 nanos:27641324}" Dec 16 12:18:05.031000 audit: BPF prog-id=171 op=UNLOAD Dec 16 12:18:05.034503 kubelet[2932]: I1216 12:18:05.034478 2932 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 16 12:18:05.055970 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-2ffaf32e985b3ea6a67c0d234763a4377effdb96c71854a71893737bfb5d0f4f-rootfs.mount: Deactivated successfully. Dec 16 12:18:06.358823 systemd[1]: Created slice kubepods-besteffort-pod01d79110_1504_420e_beb4_8a0f58f3b458.slice - libcontainer container kubepods-besteffort-pod01d79110_1504_420e_beb4_8a0f58f3b458.slice. Dec 16 12:18:06.385099 systemd[1]: Created slice kubepods-besteffort-pod1d6077b5_49a5_4d21_bd6b_0ffae41c4da0.slice - libcontainer container kubepods-besteffort-pod1d6077b5_49a5_4d21_bd6b_0ffae41c4da0.slice. Dec 16 12:18:06.390588 systemd[1]: Created slice kubepods-burstable-podba1fed23_827f_4247_97a8_f8bd8998b8f8.slice - libcontainer container kubepods-burstable-podba1fed23_827f_4247_97a8_f8bd8998b8f8.slice. 
Dec 16 12:18:06.433650 kubelet[2932]: I1216 12:18:06.433539 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6v48h\" (UniqueName: \"kubernetes.io/projected/ba1fed23-827f-4247-97a8-f8bd8998b8f8-kube-api-access-6v48h\") pod \"coredns-66bc5c9577-x2qrf\" (UID: \"ba1fed23-827f-4247-97a8-f8bd8998b8f8\") " pod="kube-system/coredns-66bc5c9577-x2qrf" Dec 16 12:18:06.433650 kubelet[2932]: I1216 12:18:06.433637 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/01d79110-1504-420e-beb4-8a0f58f3b458-tigera-ca-bundle\") pod \"calico-kube-controllers-7db869f9-8vr6z\" (UID: \"01d79110-1504-420e-beb4-8a0f58f3b458\") " pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" Dec 16 12:18:06.434065 kubelet[2932]: I1216 12:18:06.433696 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-php4p\" (UniqueName: \"kubernetes.io/projected/01d79110-1504-420e-beb4-8a0f58f3b458-kube-api-access-php4p\") pod \"calico-kube-controllers-7db869f9-8vr6z\" (UID: \"01d79110-1504-420e-beb4-8a0f58f3b458\") " pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" Dec 16 12:18:06.434065 kubelet[2932]: I1216 12:18:06.433745 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/ba1fed23-827f-4247-97a8-f8bd8998b8f8-config-volume\") pod \"coredns-66bc5c9577-x2qrf\" (UID: \"ba1fed23-827f-4247-97a8-f8bd8998b8f8\") " pod="kube-system/coredns-66bc5c9577-x2qrf" Dec 16 12:18:06.478742 containerd[1662]: time="2025-12-16T12:18:06.478700492Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp52x,Uid:1d6077b5-49a5-4d21-bd6b-0ffae41c4da0,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:06.499076 systemd[1]: Created slice 
kubepods-besteffort-podc36f8480_80e5_4b1d_b264_0fc54133bb26.slice - libcontainer container kubepods-besteffort-podc36f8480_80e5_4b1d_b264_0fc54133bb26.slice. Dec 16 12:18:06.509322 systemd[1]: Created slice kubepods-burstable-pod7d93cb67_a590_4cec_b00c_d1ee5673d5a2.slice - libcontainer container kubepods-burstable-pod7d93cb67_a590_4cec_b00c_d1ee5673d5a2.slice. Dec 16 12:18:06.516246 systemd[1]: Created slice kubepods-besteffort-pod928c4f14_bc1a_4920_8219_21947daefaad.slice - libcontainer container kubepods-besteffort-pod928c4f14_bc1a_4920_8219_21947daefaad.slice. Dec 16 12:18:06.526031 systemd[1]: Created slice kubepods-besteffort-podf42d8c07_697c_492a_826b_630e49b3282d.slice - libcontainer container kubepods-besteffort-podf42d8c07_697c_492a_826b_630e49b3282d.slice. Dec 16 12:18:06.531442 systemd[1]: Created slice kubepods-besteffort-pod963a1289_7bdf_4dde_a21a_b1e5da9ecfcc.slice - libcontainer container kubepods-besteffort-pod963a1289_7bdf_4dde_a21a_b1e5da9ecfcc.slice. Dec 16 12:18:06.534848 kubelet[2932]: I1216 12:18:06.534550 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ng6px\" (UniqueName: \"kubernetes.io/projected/963a1289-7bdf-4dde-a21a-b1e5da9ecfcc-kube-api-access-ng6px\") pod \"calico-apiserver-5d558c8677-8zlvg\" (UID: \"963a1289-7bdf-4dde-a21a-b1e5da9ecfcc\") " pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" Dec 16 12:18:06.534848 kubelet[2932]: I1216 12:18:06.534619 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928c4f14-bc1a-4920-8219-21947daefaad-whisker-ca-bundle\") pod \"whisker-5d697c444f-gtc9w\" (UID: \"928c4f14-bc1a-4920-8219-21947daefaad\") " pod="calico-system/whisker-5d697c444f-gtc9w" Dec 16 12:18:06.534848 kubelet[2932]: I1216 12:18:06.534750 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/963a1289-7bdf-4dde-a21a-b1e5da9ecfcc-calico-apiserver-certs\") pod \"calico-apiserver-5d558c8677-8zlvg\" (UID: \"963a1289-7bdf-4dde-a21a-b1e5da9ecfcc\") " pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" Dec 16 12:18:06.535950 kubelet[2932]: I1216 12:18:06.535892 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q7zjs\" (UniqueName: \"kubernetes.io/projected/c36f8480-80e5-4b1d-b264-0fc54133bb26-kube-api-access-q7zjs\") pod \"calico-apiserver-5d558c8677-khwzf\" (UID: \"c36f8480-80e5-4b1d-b264-0fc54133bb26\") " pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" Dec 16 12:18:06.536029 kubelet[2932]: I1216 12:18:06.536004 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5qgz\" (UniqueName: \"kubernetes.io/projected/f42d8c07-697c-492a-826b-630e49b3282d-kube-api-access-t5qgz\") pod \"goldmane-7c778bb748-xvgpm\" (UID: \"f42d8c07-697c-492a-826b-630e49b3282d\") " pod="calico-system/goldmane-7c778bb748-xvgpm" Dec 16 12:18:06.536296 kubelet[2932]: I1216 12:18:06.536057 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/c36f8480-80e5-4b1d-b264-0fc54133bb26-calico-apiserver-certs\") pod \"calico-apiserver-5d558c8677-khwzf\" (UID: \"c36f8480-80e5-4b1d-b264-0fc54133bb26\") " pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" Dec 16 12:18:06.536296 kubelet[2932]: I1216 12:18:06.536102 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bql66\" (UniqueName: \"kubernetes.io/projected/7d93cb67-a590-4cec-b00c-d1ee5673d5a2-kube-api-access-bql66\") pod \"coredns-66bc5c9577-hvtrp\" (UID: \"7d93cb67-a590-4cec-b00c-d1ee5673d5a2\") " pod="kube-system/coredns-66bc5c9577-hvtrp" Dec 16 12:18:06.536296 
kubelet[2932]: I1216 12:18:06.536159 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/928c4f14-bc1a-4920-8219-21947daefaad-whisker-backend-key-pair\") pod \"whisker-5d697c444f-gtc9w\" (UID: \"928c4f14-bc1a-4920-8219-21947daefaad\") " pod="calico-system/whisker-5d697c444f-gtc9w" Dec 16 12:18:06.536296 kubelet[2932]: I1216 12:18:06.536178 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h9ps8\" (UniqueName: \"kubernetes.io/projected/928c4f14-bc1a-4920-8219-21947daefaad-kube-api-access-h9ps8\") pod \"whisker-5d697c444f-gtc9w\" (UID: \"928c4f14-bc1a-4920-8219-21947daefaad\") " pod="calico-system/whisker-5d697c444f-gtc9w" Dec 16 12:18:06.536296 kubelet[2932]: I1216 12:18:06.536198 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f42d8c07-697c-492a-826b-630e49b3282d-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-xvgpm\" (UID: \"f42d8c07-697c-492a-826b-630e49b3282d\") " pod="calico-system/goldmane-7c778bb748-xvgpm" Dec 16 12:18:06.536470 kubelet[2932]: I1216 12:18:06.536216 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7d93cb67-a590-4cec-b00c-d1ee5673d5a2-config-volume\") pod \"coredns-66bc5c9577-hvtrp\" (UID: \"7d93cb67-a590-4cec-b00c-d1ee5673d5a2\") " pod="kube-system/coredns-66bc5c9577-hvtrp" Dec 16 12:18:06.536470 kubelet[2932]: I1216 12:18:06.536234 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f42d8c07-697c-492a-826b-630e49b3282d-config\") pod \"goldmane-7c778bb748-xvgpm\" (UID: \"f42d8c07-697c-492a-826b-630e49b3282d\") " 
pod="calico-system/goldmane-7c778bb748-xvgpm" Dec 16 12:18:06.536470 kubelet[2932]: I1216 12:18:06.536248 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/f42d8c07-697c-492a-826b-630e49b3282d-goldmane-key-pair\") pod \"goldmane-7c778bb748-xvgpm\" (UID: \"f42d8c07-697c-492a-826b-630e49b3282d\") " pod="calico-system/goldmane-7c778bb748-xvgpm" Dec 16 12:18:06.575539 containerd[1662]: time="2025-12-16T12:18:06.575461162Z" level=error msg="Failed to destroy network for sandbox \"02d77e8a718b41be05713fec3369f22a153b650c9ce6056b14da1cef3d0b9d8a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.577631 systemd[1]: run-netns-cni\x2d39e88c75\x2d6fbb\x2d1cf8\x2d9d66\x2d19bacc9e213b.mount: Deactivated successfully. Dec 16 12:18:06.579187 containerd[1662]: time="2025-12-16T12:18:06.579079893Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp52x,Uid:1d6077b5-49a5-4d21-bd6b-0ffae41c4da0,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"02d77e8a718b41be05713fec3369f22a153b650c9ce6056b14da1cef3d0b9d8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.579364 kubelet[2932]: E1216 12:18:06.579321 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02d77e8a718b41be05713fec3369f22a153b650c9ce6056b14da1cef3d0b9d8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 
16 12:18:06.579414 kubelet[2932]: E1216 12:18:06.579392 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02d77e8a718b41be05713fec3369f22a153b650c9ce6056b14da1cef3d0b9d8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tp52x" Dec 16 12:18:06.579449 kubelet[2932]: E1216 12:18:06.579412 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"02d77e8a718b41be05713fec3369f22a153b650c9ce6056b14da1cef3d0b9d8a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-tp52x" Dec 16 12:18:06.579486 kubelet[2932]: E1216 12:18:06.579459 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"02d77e8a718b41be05713fec3369f22a153b650c9ce6056b14da1cef3d0b9d8a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:06.672090 containerd[1662]: time="2025-12-16T12:18:06.672035631Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-7db869f9-8vr6z,Uid:01d79110-1504-420e-beb4-8a0f58f3b458,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:06.696074 containerd[1662]: time="2025-12-16T12:18:06.696034668Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-x2qrf,Uid:ba1fed23-827f-4247-97a8-f8bd8998b8f8,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:06.719225 containerd[1662]: time="2025-12-16T12:18:06.718988301Z" level=error msg="Failed to destroy network for sandbox \"4ba6ec2d535a2792abcbc86697c55a639c38f1af3cfc9f5ac0a40d93b7311a88\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.721279 containerd[1662]: time="2025-12-16T12:18:06.721231989Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7db869f9-8vr6z,Uid:01d79110-1504-420e-beb4-8a0f58f3b458,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ba6ec2d535a2792abcbc86697c55a639c38f1af3cfc9f5ac0a40d93b7311a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.721554 kubelet[2932]: E1216 12:18:06.721510 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ba6ec2d535a2792abcbc86697c55a639c38f1af3cfc9f5ac0a40d93b7311a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.721602 kubelet[2932]: E1216 12:18:06.721577 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"4ba6ec2d535a2792abcbc86697c55a639c38f1af3cfc9f5ac0a40d93b7311a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" Dec 16 12:18:06.721602 kubelet[2932]: E1216 12:18:06.721595 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"4ba6ec2d535a2792abcbc86697c55a639c38f1af3cfc9f5ac0a40d93b7311a88\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" Dec 16 12:18:06.721671 kubelet[2932]: E1216 12:18:06.721645 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"4ba6ec2d535a2792abcbc86697c55a639c38f1af3cfc9f5ac0a40d93b7311a88\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:18:06.744129 containerd[1662]: time="2025-12-16T12:18:06.744069062Z" level=error msg="Failed to destroy network for sandbox \"7ef0bc23579847f1ad9b41ad745d04ef01979e64dd3e4f3166433ac05039cb11\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the 
calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.746425 containerd[1662]: time="2025-12-16T12:18:06.746382069Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-x2qrf,Uid:ba1fed23-827f-4247-97a8-f8bd8998b8f8,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ef0bc23579847f1ad9b41ad745d04ef01979e64dd3e4f3166433ac05039cb11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.746643 kubelet[2932]: E1216 12:18:06.746607 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ef0bc23579847f1ad9b41ad745d04ef01979e64dd3e4f3166433ac05039cb11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.746696 kubelet[2932]: E1216 12:18:06.746666 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ef0bc23579847f1ad9b41ad745d04ef01979e64dd3e4f3166433ac05039cb11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-x2qrf" Dec 16 12:18:06.746696 kubelet[2932]: E1216 12:18:06.746685 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7ef0bc23579847f1ad9b41ad745d04ef01979e64dd3e4f3166433ac05039cb11\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has 
mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-x2qrf" Dec 16 12:18:06.746774 kubelet[2932]: E1216 12:18:06.746737 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-x2qrf_kube-system(ba1fed23-827f-4247-97a8-f8bd8998b8f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-x2qrf_kube-system(ba1fed23-827f-4247-97a8-f8bd8998b8f8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7ef0bc23579847f1ad9b41ad745d04ef01979e64dd3e4f3166433ac05039cb11\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-x2qrf" podUID="ba1fed23-827f-4247-97a8-f8bd8998b8f8" Dec 16 12:18:06.807927 containerd[1662]: time="2025-12-16T12:18:06.807804546Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d558c8677-khwzf,Uid:c36f8480-80e5-4b1d-b264-0fc54133bb26,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:18:06.816007 containerd[1662]: time="2025-12-16T12:18:06.815975932Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hvtrp,Uid:7d93cb67-a590-4cec-b00c-d1ee5673d5a2,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:06.826657 containerd[1662]: time="2025-12-16T12:18:06.826494006Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d697c444f-gtc9w,Uid:928c4f14-bc1a-4920-8219-21947daefaad,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:06.833562 containerd[1662]: time="2025-12-16T12:18:06.833519228Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-xvgpm,Uid:f42d8c07-697c-492a-826b-630e49b3282d,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:06.838272 containerd[1662]: time="2025-12-16T12:18:06.838239643Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5d558c8677-8zlvg,Uid:963a1289-7bdf-4dde-a21a-b1e5da9ecfcc,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:18:06.869978 containerd[1662]: time="2025-12-16T12:18:06.869928065Z" level=error msg="Failed to destroy network for sandbox \"45e345d1e5bd139a5c9759e6d96bcfd35b3bc595058fb85982c4e8b7a695f143\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.874325 containerd[1662]: time="2025-12-16T12:18:06.874159918Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d558c8677-khwzf,Uid:c36f8480-80e5-4b1d-b264-0fc54133bb26,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e345d1e5bd139a5c9759e6d96bcfd35b3bc595058fb85982c4e8b7a695f143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.874516 kubelet[2932]: E1216 12:18:06.874398 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e345d1e5bd139a5c9759e6d96bcfd35b3bc595058fb85982c4e8b7a695f143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.874516 kubelet[2932]: E1216 12:18:06.874447 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e345d1e5bd139a5c9759e6d96bcfd35b3bc595058fb85982c4e8b7a695f143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" Dec 16 12:18:06.874516 kubelet[2932]: E1216 12:18:06.874462 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"45e345d1e5bd139a5c9759e6d96bcfd35b3bc595058fb85982c4e8b7a695f143\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" Dec 16 12:18:06.874677 kubelet[2932]: E1216 12:18:06.874516 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"45e345d1e5bd139a5c9759e6d96bcfd35b3bc595058fb85982c4e8b7a695f143\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:18:06.888993 containerd[1662]: time="2025-12-16T12:18:06.888947886Z" level=error msg="Failed to destroy network for sandbox \"301523ec067af672526a10b1cc32f609c4a810e5245a1b89df0a6981de5d7590\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.897994 containerd[1662]: time="2025-12-16T12:18:06.897898955Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-66bc5c9577-hvtrp,Uid:7d93cb67-a590-4cec-b00c-d1ee5673d5a2,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"301523ec067af672526a10b1cc32f609c4a810e5245a1b89df0a6981de5d7590\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.898294 kubelet[2932]: E1216 12:18:06.898175 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"301523ec067af672526a10b1cc32f609c4a810e5245a1b89df0a6981de5d7590\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.898294 kubelet[2932]: E1216 12:18:06.898230 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"301523ec067af672526a10b1cc32f609c4a810e5245a1b89df0a6981de5d7590\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-hvtrp" Dec 16 12:18:06.898294 kubelet[2932]: E1216 12:18:06.898249 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"301523ec067af672526a10b1cc32f609c4a810e5245a1b89df0a6981de5d7590\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-hvtrp" Dec 16 12:18:06.898390 kubelet[2932]: E1216 12:18:06.898295 2932 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-hvtrp_kube-system(7d93cb67-a590-4cec-b00c-d1ee5673d5a2)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-hvtrp_kube-system(7d93cb67-a590-4cec-b00c-d1ee5673d5a2)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"301523ec067af672526a10b1cc32f609c4a810e5245a1b89df0a6981de5d7590\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-hvtrp" podUID="7d93cb67-a590-4cec-b00c-d1ee5673d5a2" Dec 16 12:18:06.903336 containerd[1662]: time="2025-12-16T12:18:06.903214492Z" level=error msg="Failed to destroy network for sandbox \"90e5c47e88a2143625d8bffb17305334648e22110afdc3088bc7e1f87f8d9a24\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.905446 containerd[1662]: time="2025-12-16T12:18:06.905320218Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5d697c444f-gtc9w,Uid:928c4f14-bc1a-4920-8219-21947daefaad,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"90e5c47e88a2143625d8bffb17305334648e22110afdc3088bc7e1f87f8d9a24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.905557 kubelet[2932]: E1216 12:18:06.905514 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90e5c47e88a2143625d8bffb17305334648e22110afdc3088bc7e1f87f8d9a24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no 
such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.905624 kubelet[2932]: E1216 12:18:06.905560 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90e5c47e88a2143625d8bffb17305334648e22110afdc3088bc7e1f87f8d9a24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5d697c444f-gtc9w" Dec 16 12:18:06.905624 kubelet[2932]: E1216 12:18:06.905578 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"90e5c47e88a2143625d8bffb17305334648e22110afdc3088bc7e1f87f8d9a24\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5d697c444f-gtc9w" Dec 16 12:18:06.905729 kubelet[2932]: E1216 12:18:06.905620 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5d697c444f-gtc9w_calico-system(928c4f14-bc1a-4920-8219-21947daefaad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5d697c444f-gtc9w_calico-system(928c4f14-bc1a-4920-8219-21947daefaad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"90e5c47e88a2143625d8bffb17305334648e22110afdc3088bc7e1f87f8d9a24\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5d697c444f-gtc9w" podUID="928c4f14-bc1a-4920-8219-21947daefaad" Dec 16 12:18:06.913962 containerd[1662]: time="2025-12-16T12:18:06.913904846Z" level=error msg="Failed 
to destroy network for sandbox \"c0902a255ff545f1fc32bf0b7030b6630618a2b24d13aa91fbc3f7e8eac574e1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.916335 containerd[1662]: time="2025-12-16T12:18:06.916295053Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-xvgpm,Uid:f42d8c07-697c-492a-826b-630e49b3282d,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0902a255ff545f1fc32bf0b7030b6630618a2b24d13aa91fbc3f7e8eac574e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.916747 kubelet[2932]: E1216 12:18:06.916689 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0902a255ff545f1fc32bf0b7030b6630618a2b24d13aa91fbc3f7e8eac574e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.916837 kubelet[2932]: E1216 12:18:06.916768 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c0902a255ff545f1fc32bf0b7030b6630618a2b24d13aa91fbc3f7e8eac574e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-xvgpm" Dec 16 12:18:06.916837 kubelet[2932]: E1216 12:18:06.916788 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for 
sandbox \"c0902a255ff545f1fc32bf0b7030b6630618a2b24d13aa91fbc3f7e8eac574e1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-xvgpm" Dec 16 12:18:06.916895 kubelet[2932]: E1216 12:18:06.916865 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c0902a255ff545f1fc32bf0b7030b6630618a2b24d13aa91fbc3f7e8eac574e1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:18:06.920275 containerd[1662]: time="2025-12-16T12:18:06.920229266Z" level=error msg="Failed to destroy network for sandbox \"9ae4a7bfb8ce443620e4b2433f0118ba07de1c9efbb774dac0e2387251e0918c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.922416 containerd[1662]: time="2025-12-16T12:18:06.922322793Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d558c8677-8zlvg,Uid:963a1289-7bdf-4dde-a21a-b1e5da9ecfcc,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae4a7bfb8ce443620e4b2433f0118ba07de1c9efbb774dac0e2387251e0918c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or 
directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.922591 kubelet[2932]: E1216 12:18:06.922521 2932 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae4a7bfb8ce443620e4b2433f0118ba07de1c9efbb774dac0e2387251e0918c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:18:06.922591 kubelet[2932]: E1216 12:18:06.922570 2932 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae4a7bfb8ce443620e4b2433f0118ba07de1c9efbb774dac0e2387251e0918c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" Dec 16 12:18:06.922591 kubelet[2932]: E1216 12:18:06.922586 2932 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9ae4a7bfb8ce443620e4b2433f0118ba07de1c9efbb774dac0e2387251e0918c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" Dec 16 12:18:06.922797 kubelet[2932]: E1216 12:18:06.922636 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc)\\\": rpc error: code = Unknown desc = failed to 
setup network for sandbox \\\"9ae4a7bfb8ce443620e4b2433f0118ba07de1c9efbb774dac0e2387251e0918c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:18:07.212268 containerd[1662]: time="2025-12-16T12:18:07.212113841Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:18:11.567169 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2204487853.mount: Deactivated successfully. Dec 16 12:18:11.593687 containerd[1662]: time="2025-12-16T12:18:11.593629435Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:11.594350 containerd[1662]: time="2025-12-16T12:18:11.594298957Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Dec 16 12:18:11.595848 containerd[1662]: time="2025-12-16T12:18:11.595799802Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:11.597780 containerd[1662]: time="2025-12-16T12:18:11.597734888Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:18:11.598254 containerd[1662]: time="2025-12-16T12:18:11.598219129Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest 
\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 4.386059128s" Dec 16 12:18:11.598300 containerd[1662]: time="2025-12-16T12:18:11.598254050Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 12:18:11.610671 containerd[1662]: time="2025-12-16T12:18:11.610612449Z" level=info msg="CreateContainer within sandbox \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:18:11.622215 containerd[1662]: time="2025-12-16T12:18:11.621338364Z" level=info msg="Container 1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:11.622637 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3907973125.mount: Deactivated successfully. Dec 16 12:18:11.631883 containerd[1662]: time="2025-12-16T12:18:11.631834597Z" level=info msg="CreateContainer within sandbox \"4642879d07dc39ea137bdc2c3f8993ff6e427fe9c17da3a821e2fd40781df02a\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1\"" Dec 16 12:18:11.632389 containerd[1662]: time="2025-12-16T12:18:11.632325839Z" level=info msg="StartContainer for \"1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1\"" Dec 16 12:18:11.633949 containerd[1662]: time="2025-12-16T12:18:11.633901084Z" level=info msg="connecting to shim 1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1" address="unix:///run/containerd/s/0ac6c768bdd4270cb5a4414d39a56532d892357106adac15928f12a56512b775" protocol=ttrpc version=3 Dec 16 12:18:11.659227 systemd[1]: Started cri-containerd-1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1.scope - libcontainer container 
1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1. Dec 16 12:18:11.733000 audit: BPF prog-id=172 op=LOAD Dec 16 12:18:11.736380 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:18:11.736434 kernel: audit: type=1334 audit(1765887491.733:572): prog-id=172 op=LOAD Dec 16 12:18:11.733000 audit[4029]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.741213 kernel: audit: type=1300 audit(1765887491.733:572): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.741278 kernel: audit: type=1327 audit(1765887491.733:572): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.733000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.733000 audit: BPF prog-id=173 op=LOAD Dec 16 12:18:11.745488 kernel: audit: type=1334 audit(1765887491.733:573): prog-id=173 op=LOAD Dec 16 12:18:11.733000 audit[4029]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.749383 kernel: audit: type=1300 audit(1765887491.733:573): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.749451 kernel: audit: type=1327 audit(1765887491.733:573): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.733000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.733000 audit: BPF prog-id=173 op=UNLOAD Dec 16 12:18:11.753648 kernel: audit: type=1334 audit(1765887491.733:574): prog-id=173 op=UNLOAD Dec 16 12:18:11.753695 kernel: audit: type=1300 audit(1765887491.733:574): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.733000 audit[4029]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.733000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.760399 kernel: audit: type=1327 audit(1765887491.733:574): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.760484 kernel: audit: type=1334 audit(1765887491.733:575): prog-id=172 op=UNLOAD Dec 16 12:18:11.733000 audit: BPF prog-id=172 op=UNLOAD Dec 16 12:18:11.733000 audit[4029]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.733000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.733000 audit: BPF prog-id=174 op=LOAD Dec 16 12:18:11.733000 audit[4029]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3491 pid=4029 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:11.733000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163383130396438626434383866363235633663383062326530346263 Dec 16 12:18:11.769845 containerd[1662]: time="2025-12-16T12:18:11.769761799Z" level=info msg="StartContainer for \"1c8109d8bd488f625c6c80b2e04bc3b8a1e33130c123b7b7173a85ca57c427e1\" returns successfully" Dec 16 12:18:11.908388 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:18:11.908494 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:18:12.075834 kubelet[2932]: I1216 12:18:12.075648 2932 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h9ps8\" (UniqueName: \"kubernetes.io/projected/928c4f14-bc1a-4920-8219-21947daefaad-kube-api-access-h9ps8\") pod \"928c4f14-bc1a-4920-8219-21947daefaad\" (UID: \"928c4f14-bc1a-4920-8219-21947daefaad\") " Dec 16 12:18:12.075834 kubelet[2932]: I1216 12:18:12.075697 2932 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/928c4f14-bc1a-4920-8219-21947daefaad-whisker-backend-key-pair\") pod \"928c4f14-bc1a-4920-8219-21947daefaad\" (UID: \"928c4f14-bc1a-4920-8219-21947daefaad\") " Dec 16 12:18:12.075834 kubelet[2932]: I1216 12:18:12.075724 2932 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928c4f14-bc1a-4920-8219-21947daefaad-whisker-ca-bundle\") pod \"928c4f14-bc1a-4920-8219-21947daefaad\" (UID: \"928c4f14-bc1a-4920-8219-21947daefaad\") " Dec 16 12:18:12.076213 kubelet[2932]: I1216 12:18:12.076190 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/928c4f14-bc1a-4920-8219-21947daefaad-whisker-ca-bundle" 
(OuterVolumeSpecName: "whisker-ca-bundle") pod "928c4f14-bc1a-4920-8219-21947daefaad" (UID: "928c4f14-bc1a-4920-8219-21947daefaad"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:18:12.079446 kubelet[2932]: I1216 12:18:12.079390 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/928c4f14-bc1a-4920-8219-21947daefaad-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "928c4f14-bc1a-4920-8219-21947daefaad" (UID: "928c4f14-bc1a-4920-8219-21947daefaad"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:18:12.080160 kubelet[2932]: I1216 12:18:12.080086 2932 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928c4f14-bc1a-4920-8219-21947daefaad-kube-api-access-h9ps8" (OuterVolumeSpecName: "kube-api-access-h9ps8") pod "928c4f14-bc1a-4920-8219-21947daefaad" (UID: "928c4f14-bc1a-4920-8219-21947daefaad"). InnerVolumeSpecName "kube-api-access-h9ps8". PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:18:12.117202 systemd[1]: Removed slice kubepods-besteffort-pod928c4f14_bc1a_4920_8219_21947daefaad.slice - libcontainer container kubepods-besteffort-pod928c4f14_bc1a_4920_8219_21947daefaad.slice. 
Dec 16 12:18:12.176358 kubelet[2932]: I1216 12:18:12.176256 2932 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/928c4f14-bc1a-4920-8219-21947daefaad-whisker-ca-bundle\") on node \"ci-4547-0-0-5-b12717c6ea\" DevicePath \"\"" Dec 16 12:18:12.176358 kubelet[2932]: I1216 12:18:12.176288 2932 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h9ps8\" (UniqueName: \"kubernetes.io/projected/928c4f14-bc1a-4920-8219-21947daefaad-kube-api-access-h9ps8\") on node \"ci-4547-0-0-5-b12717c6ea\" DevicePath \"\"" Dec 16 12:18:12.176358 kubelet[2932]: I1216 12:18:12.176301 2932 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/928c4f14-bc1a-4920-8219-21947daefaad-whisker-backend-key-pair\") on node \"ci-4547-0-0-5-b12717c6ea\" DevicePath \"\"" Dec 16 12:18:12.242058 kubelet[2932]: I1216 12:18:12.241963 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-nmpjq" podStartSLOduration=1.440357576 podStartE2EDuration="18.241944871s" podCreationTimestamp="2025-12-16 12:17:54 +0000 UTC" firstStartedPulling="2025-12-16 12:17:54.797785078 +0000 UTC m=+22.848700149" lastFinishedPulling="2025-12-16 12:18:11.599372373 +0000 UTC m=+39.650287444" observedRunningTime="2025-12-16 12:18:12.239806344 +0000 UTC m=+40.290721455" watchObservedRunningTime="2025-12-16 12:18:12.241944871 +0000 UTC m=+40.292859982" Dec 16 12:18:12.309541 systemd[1]: Created slice kubepods-besteffort-pod29a341e5_f159_490a_9c2d_ec19b832725b.slice - libcontainer container kubepods-besteffort-pod29a341e5_f159_490a_9c2d_ec19b832725b.slice. 
Dec 16 12:18:12.378989 kubelet[2932]: I1216 12:18:12.378900 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/29a341e5-f159-490a-9c2d-ec19b832725b-whisker-ca-bundle\") pod \"whisker-74bfdd549c-rv9w7\" (UID: \"29a341e5-f159-490a-9c2d-ec19b832725b\") " pod="calico-system/whisker-74bfdd549c-rv9w7" Dec 16 12:18:12.379215 kubelet[2932]: I1216 12:18:12.379046 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gz2mz\" (UniqueName: \"kubernetes.io/projected/29a341e5-f159-490a-9c2d-ec19b832725b-kube-api-access-gz2mz\") pod \"whisker-74bfdd549c-rv9w7\" (UID: \"29a341e5-f159-490a-9c2d-ec19b832725b\") " pod="calico-system/whisker-74bfdd549c-rv9w7" Dec 16 12:18:12.379215 kubelet[2932]: I1216 12:18:12.379085 2932 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/29a341e5-f159-490a-9c2d-ec19b832725b-whisker-backend-key-pair\") pod \"whisker-74bfdd549c-rv9w7\" (UID: \"29a341e5-f159-490a-9c2d-ec19b832725b\") " pod="calico-system/whisker-74bfdd549c-rv9w7" Dec 16 12:18:12.569861 systemd[1]: var-lib-kubelet-pods-928c4f14\x2dbc1a\x2d4920\x2d8219\x2d21947daefaad-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dh9ps8.mount: Deactivated successfully. Dec 16 12:18:12.569958 systemd[1]: var-lib-kubelet-pods-928c4f14\x2dbc1a\x2d4920\x2d8219\x2d21947daefaad-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 16 12:18:12.617488 containerd[1662]: time="2025-12-16T12:18:12.617437954Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74bfdd549c-rv9w7,Uid:29a341e5-f159-490a-9c2d-ec19b832725b,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:12.748407 systemd-networkd[1583]: calie326aab28e8: Link UP Dec 16 12:18:12.748912 systemd-networkd[1583]: calie326aab28e8: Gained carrier Dec 16 12:18:12.762279 containerd[1662]: 2025-12-16 12:18:12.641 [INFO][4121] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:18:12.762279 containerd[1662]: 2025-12-16 12:18:12.659 [INFO][4121] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0 whisker-74bfdd549c- calico-system 29a341e5-f159-490a-9c2d-ec19b832725b 914 0 2025-12-16 12:18:12 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:74bfdd549c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea whisker-74bfdd549c-rv9w7 eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calie326aab28e8 [] [] }} ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-" Dec 16 12:18:12.762279 containerd[1662]: 2025-12-16 12:18:12.659 [INFO][4121] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.762279 containerd[1662]: 2025-12-16 12:18:12.703 [INFO][4135] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 
ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" HandleID="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Workload="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.703 [INFO][4135] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" HandleID="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Workload="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001378d0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"whisker-74bfdd549c-rv9w7", "timestamp":"2025-12-16 12:18:12.703272149 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.703 [INFO][4135] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.703 [INFO][4135] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.703 [INFO][4135] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.713 [INFO][4135] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.718 [INFO][4135] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.722 [INFO][4135] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.724 [INFO][4135] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762493 containerd[1662]: 2025-12-16 12:18:12.727 [INFO][4135] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.727 [INFO][4135] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.729 [INFO][4135] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736 Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.733 [INFO][4135] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.738 [INFO][4135] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.79.193/26] block=192.168.79.192/26 handle="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.738 [INFO][4135] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.193/26] handle="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.739 [INFO][4135] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:12.762660 containerd[1662]: 2025-12-16 12:18:12.739 [INFO][4135] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.193/26] IPv6=[] ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" HandleID="k8s-pod-network.46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Workload="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.762779 containerd[1662]: 2025-12-16 12:18:12.741 [INFO][4121] cni-plugin/k8s.go 418: Populated endpoint ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0", GenerateName:"whisker-74bfdd549c-", Namespace:"calico-system", SelfLink:"", UID:"29a341e5-f159-490a-9c2d-ec19b832725b", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 18, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"74bfdd549c", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"whisker-74bfdd549c-rv9w7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.79.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie326aab28e8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:12.762779 containerd[1662]: 2025-12-16 12:18:12.741 [INFO][4121] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.193/32] ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.762875 containerd[1662]: 2025-12-16 12:18:12.741 [INFO][4121] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calie326aab28e8 ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.762875 containerd[1662]: 2025-12-16 12:18:12.749 [INFO][4121] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.762914 containerd[1662]: 2025-12-16 12:18:12.749 [INFO][4121] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0", GenerateName:"whisker-74bfdd549c-", Namespace:"calico-system", SelfLink:"", UID:"29a341e5-f159-490a-9c2d-ec19b832725b", ResourceVersion:"914", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 18, 12, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"74bfdd549c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736", Pod:"whisker-74bfdd549c-rv9w7", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.79.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calie326aab28e8", MAC:"2a:66:33:cd:0e:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:12.762962 containerd[1662]: 2025-12-16 12:18:12.760 [INFO][4121] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" Namespace="calico-system" Pod="whisker-74bfdd549c-rv9w7" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-whisker--74bfdd549c--rv9w7-eth0" Dec 16 12:18:12.781318 containerd[1662]: time="2025-12-16T12:18:12.781273679Z" level=info msg="connecting to shim 46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736" address="unix:///run/containerd/s/bba967b2c697ac344a4b84820222420e9cb442d7162b8e876d27fbce99479187" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:12.804051 systemd[1]: Started cri-containerd-46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736.scope - libcontainer container 46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736. Dec 16 12:18:12.812000 audit: BPF prog-id=175 op=LOAD Dec 16 12:18:12.813000 audit: BPF prog-id=176 op=LOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.813000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.813000 audit: BPF prog-id=177 op=LOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.813000 audit: BPF prog-id=178 op=LOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.813000 audit: BPF prog-id=178 op=UNLOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.813000 audit: BPF prog-id=177 op=UNLOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.813000 audit: BPF prog-id=179 op=LOAD Dec 16 12:18:12.813000 audit[4170]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4158 pid=4170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:12.813000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3436613236366366646230383738663435393534306438373433316237 Dec 16 12:18:12.835920 containerd[1662]: time="2025-12-16T12:18:12.835725413Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74bfdd549c-rv9w7,Uid:29a341e5-f159-490a-9c2d-ec19b832725b,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"46a266cfdb0878f459540d87431b73dbc5ae215c356bee8dd1ccb32234f42736\"" Dec 16 12:18:12.838422 containerd[1662]: time="2025-12-16T12:18:12.838384102Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:18:13.198202 containerd[1662]: time="2025-12-16T12:18:13.198055574Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:13.199697 containerd[1662]: time="2025-12-16T12:18:13.199654699Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:18:13.199906 containerd[1662]: time="2025-12-16T12:18:13.199733379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:13.199956 kubelet[2932]: E1216 12:18:13.199904 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:13.199956 kubelet[2932]: E1216 12:18:13.199948 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:13.200311 kubelet[2932]: E1216 12:18:13.200029 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:13.201104 containerd[1662]: time="2025-12-16T12:18:13.200940183Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:18:13.546633 containerd[1662]: time="2025-12-16T12:18:13.546424849Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:13.547668 containerd[1662]: time="2025-12-16T12:18:13.547622293Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:18:13.547737 containerd[1662]: time="2025-12-16T12:18:13.547703614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:13.547967 kubelet[2932]: E1216 12:18:13.547917 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:13.548031 kubelet[2932]: E1216 12:18:13.547970 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:13.548073 kubelet[2932]: E1216 12:18:13.548054 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:13.548172 kubelet[2932]: E1216 12:18:13.548095 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:18:14.113953 kubelet[2932]: I1216 12:18:14.113909 2932 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928c4f14-bc1a-4920-8219-21947daefaad" path="/var/lib/kubelet/pods/928c4f14-bc1a-4920-8219-21947daefaad/volumes" Dec 16 12:18:14.234768 kubelet[2932]: E1216 12:18:14.234662 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:18:14.251000 audit[4327]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=4327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:14.251000 audit[4327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=fffff527b380 a2=0 a3=1 items=0 ppid=3084 pid=4327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:14.251000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:14.261000 audit[4327]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=4327 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:14.261000 audit[4327]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff527b380 a2=0 a3=1 items=0 ppid=3084 pid=4327 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:14.261000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:14.576415 kubelet[2932]: I1216 12:18:14.576346 2932 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:18:14.581328 systemd-networkd[1583]: calie326aab28e8: Gained IPv6LL Dec 16 12:18:15.281000 audit[4353]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=4353 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:15.281000 audit[4353]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd9135280 a2=0 a3=1 items=0 
ppid=3084 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.281000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:15.290000 audit[4353]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=4353 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:15.290000 audit[4353]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffd9135280 a2=0 a3=1 items=0 ppid=3084 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.290000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:15.534000 audit: BPF prog-id=180 op=LOAD Dec 16 12:18:15.534000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffde2ad338 a2=98 a3=ffffde2ad328 items=0 ppid=4354 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.534000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:18:15.534000 audit: BPF prog-id=180 op=UNLOAD Dec 16 12:18:15.534000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffde2ad308 a3=0 items=0 ppid=4354 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.534000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:18:15.534000 audit: BPF prog-id=181 op=LOAD Dec 16 12:18:15.534000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffde2ad1e8 a2=74 a3=95 items=0 ppid=4354 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.534000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:18:15.535000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:18:15.535000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4354 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.535000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:18:15.535000 audit: BPF prog-id=182 op=LOAD Dec 16 12:18:15.535000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffde2ad218 a2=40 a3=ffffde2ad248 items=0 ppid=4354 pid=4388 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.535000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:18:15.535000 audit: BPF prog-id=182 op=UNLOAD Dec 16 12:18:15.535000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffde2ad248 items=0 ppid=4354 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.535000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:18:15.538000 audit: BPF prog-id=183 op=LOAD Dec 16 12:18:15.538000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffddfe3428 a2=98 a3=ffffddfe3418 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.538000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.538000 audit: BPF prog-id=183 op=UNLOAD Dec 16 12:18:15.538000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffddfe33f8 a3=0 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" 
exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.538000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.539000 audit: BPF prog-id=184 op=LOAD Dec 16 12:18:15.539000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffddfe30b8 a2=74 a3=95 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.539000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.539000 audit: BPF prog-id=184 op=UNLOAD Dec 16 12:18:15.539000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.539000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.539000 audit: BPF prog-id=185 op=LOAD Dec 16 12:18:15.539000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffddfe3118 a2=94 a3=2 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.539000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.539000 audit: BPF prog-id=185 op=UNLOAD Dec 16 12:18:15.539000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:18:15.539000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.650000 audit: BPF prog-id=186 op=LOAD Dec 16 12:18:15.650000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffddfe30d8 a2=40 a3=ffffddfe3108 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.650000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.651000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:18:15.651000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffddfe3108 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.651000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.660000 audit: BPF prog-id=187 op=LOAD Dec 16 12:18:15.660000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffddfe30e8 a2=94 a3=4 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.660000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.660000 audit: BPF prog-id=187 op=UNLOAD Dec 16 12:18:15.660000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.660000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 audit: BPF prog-id=188 op=LOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffddfe2f28 a2=94 a3=5 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 audit: BPF prog-id=188 op=UNLOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 audit: BPF prog-id=189 op=LOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffddfe3158 a2=94 a3=6 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 audit: BPF prog-id=189 op=UNLOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 
audit: BPF prog-id=190 op=LOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffddfe2928 a2=94 a3=83 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 audit: BPF prog-id=191 op=LOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffddfe26e8 a2=94 a3=2 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.661000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:18:15.661000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.661000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.662000 audit: BPF prog-id=190 op=UNLOAD Dec 16 12:18:15.662000 audit[4389]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1029e620 a3=10291b00 items=0 ppid=4354 pid=4389 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.662000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:18:15.671000 audit: BPF prog-id=192 op=LOAD Dec 16 12:18:15.671000 audit[4414]: 
SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffecd31858 a2=98 a3=ffffecd31848 items=0 ppid=4354 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.671000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:18:15.671000 audit: BPF prog-id=192 op=UNLOAD Dec 16 12:18:15.671000 audit[4414]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffecd31828 a3=0 items=0 ppid=4354 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.671000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:18:15.671000 audit: BPF prog-id=193 op=LOAD Dec 16 12:18:15.671000 audit[4414]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffecd31708 a2=74 a3=95 items=0 ppid=4354 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.671000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F 
Dec 16 12:18:15.671000 audit: BPF prog-id=193 op=UNLOAD Dec 16 12:18:15.671000 audit[4414]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4354 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.671000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:18:15.671000 audit: BPF prog-id=194 op=LOAD Dec 16 12:18:15.671000 audit[4414]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffecd31738 a2=40 a3=ffffecd31768 items=0 ppid=4354 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.671000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:18:15.671000 audit: BPF prog-id=194 op=UNLOAD Dec 16 12:18:15.671000 audit[4414]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffecd31768 items=0 ppid=4354 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.671000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:18:15.729607 systemd-networkd[1583]: vxlan.calico: Link UP Dec 16 12:18:15.729617 systemd-networkd[1583]: vxlan.calico: Gained carrier Dec 16 12:18:15.745000 audit: BPF prog-id=195 op=LOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff50e10f8 a2=98 a3=fffff50e10e8 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=195 op=UNLOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff50e10c8 a3=0 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=196 op=LOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff50e0dd8 a2=74 a3=95 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=197 op=LOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff50e0e38 a2=94 a3=2 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=197 op=UNLOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=198 op=LOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff50e0cb8 a2=40 a3=fffff50e0ce8 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=198 op=UNLOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=fffff50e0ce8 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=199 op=LOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff50e0e08 a2=94 a3=b7 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.745000 audit: BPF prog-id=199 op=UNLOAD Dec 16 12:18:15.745000 audit[4443]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.745000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.746000 audit: BPF prog-id=200 op=LOAD Dec 16 12:18:15.746000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff50e04b8 a2=94 a3=2 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.746000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.746000 audit: BPF prog-id=200 op=UNLOAD Dec 16 12:18:15.746000 audit[4443]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.746000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.746000 audit: BPF prog-id=201 op=LOAD Dec 16 12:18:15.746000 audit[4443]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff50e0648 a2=94 a3=30 items=0 ppid=4354 pid=4443 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.746000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:18:15.749000 audit: BPF prog-id=202 op=LOAD Dec 16 12:18:15.749000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffefab8f38 a2=98 a3=ffffefab8f28 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.749000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.750000 audit: BPF prog-id=202 op=UNLOAD Dec 16 12:18:15.750000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffefab8f08 a3=0 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.750000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.750000 audit: BPF prog-id=203 op=LOAD Dec 16 12:18:15.750000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffefab8bc8 a2=74 a3=95 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.750000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.750000 audit: BPF prog-id=203 op=UNLOAD Dec 16 12:18:15.750000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.750000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.750000 audit: BPF prog-id=204 op=LOAD Dec 16 12:18:15.750000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffefab8c28 a2=94 a3=2 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.750000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.750000 audit: BPF prog-id=204 op=UNLOAD Dec 16 12:18:15.750000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.750000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.852000 audit: BPF prog-id=205 op=LOAD Dec 16 12:18:15.852000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffefab8be8 a2=40 a3=ffffefab8c18 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.852000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.853000 audit: BPF prog-id=205 op=UNLOAD Dec 16 12:18:15.853000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffefab8c18 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.853000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.862000 audit: BPF prog-id=206 op=LOAD Dec 16 12:18:15.862000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffefab8bf8 a2=94 a3=4 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.862000 audit: BPF prog-id=206 op=UNLOAD Dec 16 12:18:15.862000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.862000 audit: BPF prog-id=207 op=LOAD Dec 16 12:18:15.862000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffefab8a38 a2=94 a3=5 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.862000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.862000 audit: BPF prog-id=207 op=UNLOAD Dec 16 12:18:15.862000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.862000 audit: BPF prog-id=208 op=LOAD Dec 16 12:18:15.862000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffefab8c68 a2=94 a3=6 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.862000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.862000 audit: BPF prog-id=208 op=UNLOAD Dec 16 12:18:15.862000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.862000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.863000 audit: BPF prog-id=209 op=LOAD Dec 16 12:18:15.863000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffefab8438 a2=94 a3=83 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.863000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.863000 audit: BPF prog-id=210 op=LOAD Dec 16 12:18:15.863000 audit[4445]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffefab81f8 a2=94 a3=2 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.863000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.863000 audit: BPF prog-id=210 op=UNLOAD Dec 16 12:18:15.863000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.863000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.863000 audit: BPF prog-id=209 op=UNLOAD Dec 16 12:18:15.863000 audit[4445]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=176ba620 a3=176adb00 items=0 ppid=4354 pid=4445 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.863000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:18:15.876000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:18:15.876000 audit[4354]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000738640 a2=0 a3=0 items=0 ppid=4199 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.876000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:18:15.914000 audit[4471]: NETFILTER_CFG table=mangle:121 family=2 entries=16 op=nft_register_chain pid=4471 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:15.914000 audit[4471]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffefb1a820 a2=0 a3=ffffacf6bfa8 items=0 ppid=4354 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.914000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:15.918000 audit[4475]: NETFILTER_CFG table=nat:122 family=2 entries=15 op=nft_register_chain pid=4475 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:15.918000 audit[4475]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffe4751f80 a2=0 a3=ffff8ac86fa8 items=0 ppid=4354 pid=4475 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.918000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:15.921000 audit[4470]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4470 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:15.921000 audit[4470]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffd446c7f0 a2=0 a3=ffff8a828fa8 items=0 ppid=4354 pid=4470 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.921000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:15.922000 audit[4472]: NETFILTER_CFG table=filter:124 family=2 entries=94 op=nft_register_chain pid=4472 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:15.922000 audit[4472]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffc04750d0 a2=0 a3=ffffb6c92fa8 items=0 ppid=4354 pid=4472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:15.922000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:17.589087 systemd-networkd[1583]: vxlan.calico: Gained IPv6LL Dec 16 12:18:19.112338 containerd[1662]: time="2025-12-16T12:18:19.112205716Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-xvgpm,Uid:f42d8c07-697c-492a-826b-630e49b3282d,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:19.114085 containerd[1662]: time="2025-12-16T12:18:19.113943602Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d558c8677-khwzf,Uid:c36f8480-80e5-4b1d-b264-0fc54133bb26,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:18:19.226168 systemd-networkd[1583]: calia10707dd317: Link UP Dec 16 12:18:19.226473 systemd-networkd[1583]: calia10707dd317: Gained carrier Dec 16 12:18:19.243792 containerd[1662]: 2025-12-16 12:18:19.160 [INFO][4499] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0 calico-apiserver-5d558c8677- calico-apiserver c36f8480-80e5-4b1d-b264-0fc54133bb26 849 0 2025-12-16 12:17:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d558c8677 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea calico-apiserver-5d558c8677-khwzf eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia10707dd317 [] [] }} ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" 
Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-" Dec 16 12:18:19.243792 containerd[1662]: 2025-12-16 12:18:19.160 [INFO][4499] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.243792 containerd[1662]: 2025-12-16 12:18:19.182 [INFO][4517] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" HandleID="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.182 [INFO][4517] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" HandleID="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fd0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"calico-apiserver-5d558c8677-khwzf", "timestamp":"2025-12-16 12:18:19.182302581 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.182 [INFO][4517] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
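The audit `PROCTITLE` payloads scattered through the records above are the process command lines, hex-encoded with NUL bytes separating the argv entries. A minimal sketch for decoding them (the helper name is ours, not part of any audit tooling):

```python
def decode_proctitle(hex_payload: str) -> list[str]:
    """Split an audit PROCTITLE hex payload into its argv list.
    Arguments are separated by NUL bytes in the raw record."""
    raw = bytes.fromhex(hex_payload)
    return [arg.decode("utf-8", errors="replace") for arg in raw.split(b"\x00")]

# Leading bytes of the bpftool record logged above:
print(decode_proctitle("627066746F6F6C0070726F67006C6F6164"))
# → ['bpftool', 'prog', 'load']
```

Applied to the full payloads, the records decode to invocations such as `bpftool prog load /usr/lib/calico/bpf/filter.o /sys/fs/bpf/calico/xdp/prefilter_v1_calico_tmp_A type xdp`, `calico-node -felix`, and `iptables-nft-restore --noflush --verbose --wait 10 --wait-interval 50000`, which matches the `comm=` and `exe=` fields on the corresponding SYSCALL records.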
Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.182 [INFO][4517] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.182 [INFO][4517] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.192 [INFO][4517] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.198 [INFO][4517] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.203 [INFO][4517] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.205 [INFO][4517] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244020 containerd[1662]: 2025-12-16 12:18:19.207 [INFO][4517] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.208 [INFO][4517] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.209 [INFO][4517] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.213 [INFO][4517] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" 
host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.219 [INFO][4517] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.194/26] block=192.168.79.192/26 handle="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.219 [INFO][4517] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.194/26] handle="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.219 [INFO][4517] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:19.244233 containerd[1662]: 2025-12-16 12:18:19.219 [INFO][4517] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.194/26] IPv6=[] ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" HandleID="k8s-pod-network.bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.244367 containerd[1662]: 2025-12-16 12:18:19.221 [INFO][4499] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0", GenerateName:"calico-apiserver-5d558c8677-", Namespace:"calico-apiserver", SelfLink:"", UID:"c36f8480-80e5-4b1d-b264-0fc54133bb26", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 47, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d558c8677", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"calico-apiserver-5d558c8677-khwzf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia10707dd317", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:19.244429 containerd[1662]: 2025-12-16 12:18:19.222 [INFO][4499] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.194/32] ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.244429 containerd[1662]: 2025-12-16 12:18:19.222 [INFO][4499] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia10707dd317 ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.244429 containerd[1662]: 2025-12-16 12:18:19.226 [INFO][4499] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.244492 containerd[1662]: 2025-12-16 12:18:19.229 [INFO][4499] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0", GenerateName:"calico-apiserver-5d558c8677-", Namespace:"calico-apiserver", SelfLink:"", UID:"c36f8480-80e5-4b1d-b264-0fc54133bb26", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d558c8677", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c", Pod:"calico-apiserver-5d558c8677-khwzf", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia10707dd317", MAC:"e2:e0:7a:75:8c:fb", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:19.244538 containerd[1662]: 2025-12-16 12:18:19.241 [INFO][4499] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-khwzf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--khwzf-eth0" Dec 16 12:18:19.252000 audit[4545]: NETFILTER_CFG table=filter:125 family=2 entries=50 op=nft_register_chain pid=4545 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:19.255162 kernel: kauditd_printk_skb: 237 callbacks suppressed Dec 16 12:18:19.255221 kernel: audit: type=1325 audit(1765887499.252:655): table=filter:125 family=2 entries=50 op=nft_register_chain pid=4545 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:19.252000 audit[4545]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=fffff210ebe0 a2=0 a3=ffff9a2defa8 items=0 ppid=4354 pid=4545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.261672 kernel: audit: type=1300 audit(1765887499.252:655): arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=fffff210ebe0 a2=0 a3=ffff9a2defa8 items=0 ppid=4354 pid=4545 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.261742 kernel: audit: type=1327 audit(1765887499.252:655): 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:19.252000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:19.267707 containerd[1662]: time="2025-12-16T12:18:19.267666574Z" level=info msg="connecting to shim bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c" address="unix:///run/containerd/s/006dfa698f7bc2287bfeff8170b764bc4aae411d422f9a21851bd1e1042a01b3" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:19.291026 systemd[1]: Started cri-containerd-bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c.scope - libcontainer container bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c. Dec 16 12:18:19.302000 audit: BPF prog-id=211 op=LOAD Dec 16 12:18:19.305831 kernel: audit: type=1334 audit(1765887499.302:656): prog-id=211 op=LOAD Dec 16 12:18:19.304000 audit: BPF prog-id=212 op=LOAD Dec 16 12:18:19.304000 audit[4567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.310132 kernel: audit: type=1334 audit(1765887499.304:657): prog-id=212 op=LOAD Dec 16 12:18:19.310266 kernel: audit: type=1300 audit(1765887499.304:657): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.304000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.313879 kernel: audit: type=1327 audit(1765887499.304:657): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.304000 audit: BPF prog-id=212 op=UNLOAD Dec 16 12:18:19.304000 audit[4567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.317852 kernel: audit: type=1334 audit(1765887499.304:658): prog-id=212 op=UNLOAD Dec 16 12:18:19.317949 kernel: audit: type=1300 audit(1765887499.304:658): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.321039 kernel: audit: type=1327 audit(1765887499.304:658): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.304000 audit: BPF prog-id=213 op=LOAD Dec 16 12:18:19.304000 audit[4567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.304000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.306000 audit: BPF prog-id=214 op=LOAD Dec 16 12:18:19.306000 audit[4567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.309000 audit: BPF prog-id=214 op=UNLOAD Dec 16 12:18:19.309000 audit[4567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:18:19.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.309000 audit: BPF prog-id=213 op=UNLOAD Dec 16 12:18:19.309000 audit[4567]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.309000 audit: BPF prog-id=215 op=LOAD Dec 16 12:18:19.309000 audit[4567]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4555 pid=4567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.309000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262383031343362656666303335326564333837626639323165343331 Dec 16 12:18:19.344272 systemd-networkd[1583]: cali4e4a1e5dc9d: Link UP Dec 16 12:18:19.345442 systemd-networkd[1583]: cali4e4a1e5dc9d: Gained carrier Dec 16 12:18:19.354776 containerd[1662]: time="2025-12-16T12:18:19.354702413Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-5d558c8677-khwzf,Uid:c36f8480-80e5-4b1d-b264-0fc54133bb26,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bb80143beff0352ed387bf921e431e11edf7fbf68d4960d98da53887ae5edc6c\"" Dec 16 12:18:19.357178 containerd[1662]: time="2025-12-16T12:18:19.356604139Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:18:19.361996 containerd[1662]: 2025-12-16 12:18:19.162 [INFO][4487] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0 goldmane-7c778bb748- calico-system f42d8c07-697c-492a-826b-630e49b3282d 852 0 2025-12-16 12:17:52 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea goldmane-7c778bb748-xvgpm eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali4e4a1e5dc9d [] [] }} ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-" Dec 16 12:18:19.361996 containerd[1662]: 2025-12-16 12:18:19.162 [INFO][4487] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.361996 containerd[1662]: 2025-12-16 12:18:19.183 [INFO][4523] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" HandleID="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" 
Workload="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.183 [INFO][4523] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" HandleID="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Workload="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000117370), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"goldmane-7c778bb748-xvgpm", "timestamp":"2025-12-16 12:18:19.183535985 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.183 [INFO][4523] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.219 [INFO][4523] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.221 [INFO][4523] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.293 [INFO][4523] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.298 [INFO][4523] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.305 [INFO][4523] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.311 [INFO][4523] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362219 containerd[1662]: 2025-12-16 12:18:19.313 [INFO][4523] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.313 [INFO][4523] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.318 [INFO][4523] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8 Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.323 [INFO][4523] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.333 [INFO][4523] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.79.195/26] block=192.168.79.192/26 handle="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.336 [INFO][4523] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.195/26] handle="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.336 [INFO][4523] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:19.362586 containerd[1662]: 2025-12-16 12:18:19.336 [INFO][4523] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.195/26] IPv6=[] ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" HandleID="k8s-pod-network.0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Workload="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.362825 containerd[1662]: 2025-12-16 12:18:19.340 [INFO][4487] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"f42d8c07-697c-492a-826b-630e49b3282d", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"goldmane-7c778bb748-xvgpm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.79.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e4a1e5dc9d", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:19.362896 containerd[1662]: 2025-12-16 12:18:19.340 [INFO][4487] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.195/32] ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.362896 containerd[1662]: 2025-12-16 12:18:19.340 [INFO][4487] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4e4a1e5dc9d ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.362896 containerd[1662]: 2025-12-16 12:18:19.346 [INFO][4487] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.362961 containerd[1662]: 2025-12-16 12:18:19.346 [INFO][4487] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"f42d8c07-697c-492a-826b-630e49b3282d", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 52, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8", Pod:"goldmane-7c778bb748-xvgpm", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.79.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali4e4a1e5dc9d", MAC:"1e:a6:44:8b:b0:f2", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:19.363009 containerd[1662]: 2025-12-16 12:18:19.358 [INFO][4487] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" Namespace="calico-system" Pod="goldmane-7c778bb748-xvgpm" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-goldmane--7c778bb748--xvgpm-eth0" Dec 16 12:18:19.385000 audit[4600]: NETFILTER_CFG table=filter:126 family=2 entries=48 op=nft_register_chain pid=4600 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:19.385000 audit[4600]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26368 a0=3 a1=ffffcd480000 a2=0 a3=ffffa9b0efa8 items=0 ppid=4354 pid=4600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.385000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:19.398093 containerd[1662]: time="2025-12-16T12:18:19.398047152Z" level=info msg="connecting to shim 0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8" address="unix:///run/containerd/s/ba05272c36b85cd0d10d0d2c00a8e11caf518bb1e9ad522b3b0a7198d4ec2518" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:19.430240 systemd[1]: Started cri-containerd-0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8.scope - libcontainer container 0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8. 
Dec 16 12:18:19.442000 audit: BPF prog-id=216 op=LOAD Dec 16 12:18:19.443000 audit: BPF prog-id=217 op=LOAD Dec 16 12:18:19.443000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.443000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.443000 audit: BPF prog-id=217 op=UNLOAD Dec 16 12:18:19.443000 audit[4620]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.443000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.444000 audit: BPF prog-id=218 op=LOAD Dec 16 12:18:19.444000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.444000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.444000 audit: BPF prog-id=219 op=LOAD Dec 16 12:18:19.444000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.444000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.444000 audit: BPF prog-id=219 op=UNLOAD Dec 16 12:18:19.444000 audit[4620]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.444000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.444000 audit: BPF prog-id=218 op=UNLOAD Dec 16 12:18:19.444000 audit[4620]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:18:19.444000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.444000 audit: BPF prog-id=220 op=LOAD Dec 16 12:18:19.444000 audit[4620]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4609 pid=4620 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:19.444000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031303563353935336232346234666265613832653134613339336335 Dec 16 12:18:19.474127 containerd[1662]: time="2025-12-16T12:18:19.474088276Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-xvgpm,Uid:f42d8c07-697c-492a-826b-630e49b3282d,Namespace:calico-system,Attempt:0,} returns sandbox id \"0105c5953b24b4fbea82e14a393c5b7872aad8f4b0cd84e7acb524f38fdee9c8\"" Dec 16 12:18:19.697993 containerd[1662]: time="2025-12-16T12:18:19.697931953Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:19.699270 containerd[1662]: time="2025-12-16T12:18:19.699222957Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:18:19.699338 containerd[1662]: time="2025-12-16T12:18:19.699301437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, 
bytes read=0" Dec 16 12:18:19.699526 kubelet[2932]: E1216 12:18:19.699449 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:19.699526 kubelet[2932]: E1216 12:18:19.699519 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:19.699996 kubelet[2932]: E1216 12:18:19.699691 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:19.699996 kubelet[2932]: E1216 12:18:19.699738 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:18:19.700064 containerd[1662]: time="2025-12-16T12:18:19.699822839Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:18:20.047108 containerd[1662]: time="2025-12-16T12:18:20.046902430Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Dec 16 12:18:20.048368 containerd[1662]: time="2025-12-16T12:18:20.048291875Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:18:20.048368 containerd[1662]: time="2025-12-16T12:18:20.048326715Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:20.048548 kubelet[2932]: E1216 12:18:20.048496 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:20.048598 kubelet[2932]: E1216 12:18:20.048560 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:20.048663 kubelet[2932]: E1216 12:18:20.048643 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:20.049585 kubelet[2932]: E1216 12:18:20.048681 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:18:20.112898 containerd[1662]: time="2025-12-16T12:18:20.112859322Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp52x,Uid:1d6077b5-49a5-4d21-bd6b-0ffae41c4da0,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:20.114606 containerd[1662]: time="2025-12-16T12:18:20.114558087Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-x2qrf,Uid:ba1fed23-827f-4247-97a8-f8bd8998b8f8,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:20.252248 kubelet[2932]: E1216 12:18:20.252120 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:18:20.256426 kubelet[2932]: E1216 12:18:20.256370 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:18:20.264657 systemd-networkd[1583]: cali1cd90a63950: Link UP Dec 16 12:18:20.265585 systemd-networkd[1583]: cali1cd90a63950: Gained carrier 
Dec 16 12:18:20.293380 containerd[1662]: 2025-12-16 12:18:20.169 [INFO][4646] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0 csi-node-driver- calico-system 1d6077b5-49a5-4d21-bd6b-0ffae41c4da0 749 0 2025-12-16 12:17:54 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea csi-node-driver-tp52x eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali1cd90a63950 [] [] }} ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-" Dec 16 12:18:20.293380 containerd[1662]: 2025-12-16 12:18:20.169 [INFO][4646] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.293380 containerd[1662]: 2025-12-16 12:18:20.206 [INFO][4674] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" HandleID="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Workload="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.206 [INFO][4674] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" 
HandleID="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Workload="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40004384e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"csi-node-driver-tp52x", "timestamp":"2025-12-16 12:18:20.20607778 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.206 [INFO][4674] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.206 [INFO][4674] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.206 [INFO][4674] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.220 [INFO][4674] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.224 [INFO][4674] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.231 [INFO][4674] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.234 [INFO][4674] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.293915 containerd[1662]: 2025-12-16 12:18:20.240 [INFO][4674] ipam/ipam.go 235: Affinity is confirmed and block has 
been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.240 [INFO][4674] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.242 [INFO][4674] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77 Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.248 [INFO][4674] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.258 [INFO][4674] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.196/26] block=192.168.79.192/26 handle="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.258 [INFO][4674] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.196/26] handle="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.258 [INFO][4674] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:18:20.294384 containerd[1662]: 2025-12-16 12:18:20.258 [INFO][4674] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.196/26] IPv6=[] ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" HandleID="k8s-pod-network.1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Workload="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.294675 containerd[1662]: 2025-12-16 12:18:20.262 [INFO][4646] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"csi-node-driver-tp52x", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.79.196/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1cd90a63950", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:20.294804 containerd[1662]: 2025-12-16 12:18:20.262 [INFO][4646] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.196/32] ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.294804 containerd[1662]: 2025-12-16 12:18:20.262 [INFO][4646] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1cd90a63950 ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.294804 containerd[1662]: 2025-12-16 12:18:20.274 [INFO][4646] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.295323 containerd[1662]: 2025-12-16 12:18:20.275 [INFO][4646] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"1d6077b5-49a5-4d21-bd6b-0ffae41c4da0", ResourceVersion:"749", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77", Pod:"csi-node-driver-tp52x", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.79.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali1cd90a63950", MAC:"7a:3a:0c:51:0b:d1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:20.295895 containerd[1662]: 2025-12-16 12:18:20.290 [INFO][4646] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" Namespace="calico-system" Pod="csi-node-driver-tp52x" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-csi--node--driver--tp52x-eth0" Dec 16 12:18:20.294000 audit[4695]: NETFILTER_CFG table=filter:127 family=2 entries=20 op=nft_register_rule pid=4695 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:20.294000 audit[4695]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=7480 a0=3 a1=ffffc9c528f0 a2=0 a3=1 items=0 ppid=3084 pid=4695 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.294000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:20.299000 audit[4695]: NETFILTER_CFG table=nat:128 family=2 entries=14 op=nft_register_rule pid=4695 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:20.299000 audit[4695]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc9c528f0 a2=0 a3=1 items=0 ppid=3084 pid=4695 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.299000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:20.308000 audit[4702]: NETFILTER_CFG table=filter:129 family=2 entries=44 op=nft_register_chain pid=4702 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:20.308000 audit[4702]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21952 a0=3 a1=ffffccf4e380 a2=0 a3=ffffb3551fa8 items=0 ppid=4354 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.308000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:20.323033 containerd[1662]: time="2025-12-16T12:18:20.322981835Z" level=info msg="connecting to shim 1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77" 
address="unix:///run/containerd/s/df7c168959b0982e44bea46dc9aca600532a28271edc7dc17018b56ac5ec2d47" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:20.357758 systemd-networkd[1583]: cali110d2928018: Link UP Dec 16 12:18:20.358069 systemd-networkd[1583]: cali110d2928018: Gained carrier Dec 16 12:18:20.358367 systemd[1]: Started cri-containerd-1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77.scope - libcontainer container 1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77. Dec 16 12:18:20.375141 containerd[1662]: 2025-12-16 12:18:20.173 [INFO][4658] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0 coredns-66bc5c9577- kube-system ba1fed23-827f-4247-97a8-f8bd8998b8f8 848 0 2025-12-16 12:17:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea coredns-66bc5c9577-x2qrf eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali110d2928018 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-" Dec 16 12:18:20.375141 containerd[1662]: 2025-12-16 12:18:20.174 [INFO][4658] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.375141 containerd[1662]: 2025-12-16 12:18:20.219 [INFO][4680] ipam/ipam_plugin.go 227: Calico CNI IPAM 
request count IPv4=1 IPv6=0 ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" HandleID="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Workload="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.219 [INFO][4680] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" HandleID="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Workload="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c2fd0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"coredns-66bc5c9577-x2qrf", "timestamp":"2025-12-16 12:18:20.219525703 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.219 [INFO][4680] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.258 [INFO][4680] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.259 [INFO][4680] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.319 [INFO][4680] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.326 [INFO][4680] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.331 [INFO][4680] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.336 [INFO][4680] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375330 containerd[1662]: 2025-12-16 12:18:20.338 [INFO][4680] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.338 [INFO][4680] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.340 [INFO][4680] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9 Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.344 [INFO][4680] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.353 [INFO][4680] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.79.197/26] block=192.168.79.192/26 handle="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.353 [INFO][4680] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.197/26] handle="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.353 [INFO][4680] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:20.375766 containerd[1662]: 2025-12-16 12:18:20.353 [INFO][4680] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.197/26] IPv6=[] ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" HandleID="k8s-pod-network.bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Workload="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.375940 containerd[1662]: 2025-12-16 12:18:20.355 [INFO][4658] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ba1fed23-827f-4247-97a8-f8bd8998b8f8", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"coredns-66bc5c9577-x2qrf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali110d2928018", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:20.375940 containerd[1662]: 2025-12-16 12:18:20.355 [INFO][4658] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.197/32] ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.375940 containerd[1662]: 2025-12-16 12:18:20.355 [INFO][4658] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali110d2928018 
ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.375940 containerd[1662]: 2025-12-16 12:18:20.360 [INFO][4658] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.375940 containerd[1662]: 2025-12-16 12:18:20.361 [INFO][4658] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"ba1fed23-827f-4247-97a8-f8bd8998b8f8", ResourceVersion:"848", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", 
ContainerID:"bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9", Pod:"coredns-66bc5c9577-x2qrf", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali110d2928018", MAC:"9e:23:ed:42:99:f6", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:20.376110 containerd[1662]: 2025-12-16 12:18:20.372 [INFO][4658] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" Namespace="kube-system" Pod="coredns-66bc5c9577-x2qrf" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--x2qrf-eth0" Dec 16 12:18:20.378000 audit: BPF prog-id=221 op=LOAD Dec 16 12:18:20.379000 audit: BPF prog-id=222 op=LOAD Dec 16 12:18:20.379000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:18:20.379000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.379000 audit: BPF prog-id=222 op=UNLOAD Dec 16 12:18:20.379000 audit[4722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.379000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.380000 audit: BPF prog-id=223 op=LOAD Dec 16 12:18:20.380000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.380000 audit: BPF prog-id=224 op=LOAD Dec 16 12:18:20.380000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.380000 audit: BPF prog-id=224 op=UNLOAD Dec 16 12:18:20.380000 audit[4722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.380000 audit: BPF prog-id=223 op=UNLOAD Dec 16 12:18:20.380000 audit[4722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.380000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.381000 audit: BPF prog-id=225 op=LOAD Dec 16 12:18:20.381000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3163343136343436366533626566376261613833383030333238323736 Dec 16 12:18:20.398000 audit[4751]: NETFILTER_CFG table=filter:130 family=2 entries=60 op=nft_register_chain pid=4751 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:20.398000 audit[4751]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28968 a0=3 a1=ffffee7baa30 a2=0 a3=ffff80562fa8 items=0 ppid=4354 pid=4751 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.398000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:20.402064 containerd[1662]: time="2025-12-16T12:18:20.402033008Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-tp52x,Uid:1d6077b5-49a5-4d21-bd6b-0ffae41c4da0,Namespace:calico-system,Attempt:0,} returns sandbox id \"1c4164466e3bef7baa83800328276ac904ffda7680ad723ca705c64851bbed77\"" Dec 16 12:18:20.403634 containerd[1662]: time="2025-12-16T12:18:20.403611853Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:18:20.404966 systemd-networkd[1583]: calia10707dd317: Gained IPv6LL Dec 16 12:18:20.417064 containerd[1662]: time="2025-12-16T12:18:20.416936856Z" level=info msg="connecting to shim bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9" address="unix:///run/containerd/s/5bed6da185064522d1f1a92ba1708fea33304952ac748008bd327025e948cc76" namespace=k8s.io 
protocol=ttrpc version=3 Dec 16 12:18:20.442025 systemd[1]: Started cri-containerd-bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9.scope - libcontainer container bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9. Dec 16 12:18:20.450000 audit: BPF prog-id=226 op=LOAD Dec 16 12:18:20.450000 audit: BPF prog-id=227 op=LOAD Dec 16 12:18:20.450000 audit[4777]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.450000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.450000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:18:20.450000 audit[4777]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.450000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.451000 audit: BPF prog-id=228 op=LOAD Dec 16 12:18:20.451000 audit[4777]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.451000 audit: BPF prog-id=229 op=LOAD Dec 16 12:18:20.451000 audit[4777]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.451000 audit: BPF prog-id=229 op=UNLOAD Dec 16 12:18:20.451000 audit[4777]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.451000 audit: BPF prog-id=228 op=UNLOAD Dec 16 12:18:20.451000 audit[4777]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.451000 audit: BPF prog-id=230 op=LOAD Dec 16 12:18:20.451000 audit[4777]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4767 pid=4777 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.451000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6261643231626162313033356666653532393261626466616339316339 Dec 16 12:18:20.474992 containerd[1662]: time="2025-12-16T12:18:20.474950881Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-x2qrf,Uid:ba1fed23-827f-4247-97a8-f8bd8998b8f8,Namespace:kube-system,Attempt:0,} returns sandbox id \"bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9\"" Dec 16 12:18:20.480071 containerd[1662]: time="2025-12-16T12:18:20.480037178Z" level=info msg="CreateContainer within sandbox \"bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:18:20.488093 containerd[1662]: time="2025-12-16T12:18:20.487514642Z" level=info msg="Container c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:20.494465 containerd[1662]: time="2025-12-16T12:18:20.494433904Z" level=info msg="CreateContainer within 
sandbox \"bad21bab1035ffe5292abdfac91c908a5342e92b728a6e57a02390fe14734dd9\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95\"" Dec 16 12:18:20.494974 containerd[1662]: time="2025-12-16T12:18:20.494946945Z" level=info msg="StartContainer for \"c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95\"" Dec 16 12:18:20.495838 containerd[1662]: time="2025-12-16T12:18:20.495795308Z" level=info msg="connecting to shim c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95" address="unix:///run/containerd/s/5bed6da185064522d1f1a92ba1708fea33304952ac748008bd327025e948cc76" protocol=ttrpc version=3 Dec 16 12:18:20.515071 systemd[1]: Started cri-containerd-c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95.scope - libcontainer container c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95. Dec 16 12:18:20.524000 audit: BPF prog-id=231 op=LOAD Dec 16 12:18:20.524000 audit: BPF prog-id=232 op=LOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.524000 audit: BPF prog-id=232 op=UNLOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.524000 audit: BPF prog-id=233 op=LOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.524000 audit: BPF prog-id=234 op=LOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.524000 audit: BPF prog-id=234 op=UNLOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.524000 audit: BPF prog-id=233 op=UNLOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.524000 audit: BPF prog-id=235 op=LOAD Dec 16 12:18:20.524000 audit[4804]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4767 pid=4804 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:20.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334316662323839663366383066623761313734663336313133643735 Dec 16 12:18:20.542850 containerd[1662]: time="2025-12-16T12:18:20.542781459Z" level=info msg="StartContainer for \"c41fb289f3f80fb7a174f36113d75baaf58676ec822498096a47b84d2b9dbc95\" returns successfully" Dec 16 12:18:20.780575 
containerd[1662]: time="2025-12-16T12:18:20.780498900Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:20.782353 containerd[1662]: time="2025-12-16T12:18:20.782306506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:18:20.782451 containerd[1662]: time="2025-12-16T12:18:20.782393826Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:20.782587 kubelet[2932]: E1216 12:18:20.782555 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:18:20.782952 kubelet[2932]: E1216 12:18:20.782598 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:18:20.782952 kubelet[2932]: E1216 12:18:20.782672 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:20.784019 containerd[1662]: time="2025-12-16T12:18:20.783994791Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:18:20.917033 systemd-networkd[1583]: 
cali4e4a1e5dc9d: Gained IPv6LL Dec 16 12:18:21.122872 containerd[1662]: time="2025-12-16T12:18:21.122377155Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:21.124729 containerd[1662]: time="2025-12-16T12:18:21.124670122Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:18:21.124795 containerd[1662]: time="2025-12-16T12:18:21.124685522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:21.125136 kubelet[2932]: E1216 12:18:21.124942 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:18:21.125136 kubelet[2932]: E1216 12:18:21.125000 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:18:21.125136 kubelet[2932]: E1216 12:18:21.125081 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:21.125300 kubelet[2932]: E1216 12:18:21.125116 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:21.260605 kubelet[2932]: E1216 12:18:21.260274 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:18:21.260605 kubelet[2932]: E1216 12:18:21.260414 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" 
podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:18:21.261473 kubelet[2932]: E1216 12:18:21.261413 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:21.272598 kubelet[2932]: I1216 12:18:21.272527 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-x2qrf" podStartSLOduration=44.272508836 podStartE2EDuration="44.272508836s" podCreationTimestamp="2025-12-16 12:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:18:21.271676033 +0000 UTC m=+49.322591144" watchObservedRunningTime="2025-12-16 12:18:21.272508836 +0000 UTC m=+49.323423947" Dec 16 12:18:21.314000 audit[4846]: NETFILTER_CFG table=filter:131 family=2 entries=20 op=nft_register_rule pid=4846 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:21.314000 audit[4846]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff9051a20 a2=0 a3=1 items=0 ppid=3084 pid=4846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:21.314000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:21.321000 audit[4846]: NETFILTER_CFG table=nat:132 family=2 entries=14 op=nft_register_rule pid=4846 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:21.321000 audit[4846]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffff9051a20 a2=0 a3=1 items=0 ppid=3084 pid=4846 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:21.321000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:21.493090 systemd-networkd[1583]: cali1cd90a63950: Gained IPv6LL Dec 16 12:18:21.685164 systemd-networkd[1583]: cali110d2928018: Gained IPv6LL Dec 16 12:18:22.114490 containerd[1662]: time="2025-12-16T12:18:22.114450493Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7db869f9-8vr6z,Uid:01d79110-1504-420e-beb4-8a0f58f3b458,Namespace:calico-system,Attempt:0,}" Dec 16 12:18:22.116241 containerd[1662]: time="2025-12-16T12:18:22.116209178Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hvtrp,Uid:7d93cb67-a590-4cec-b00c-d1ee5673d5a2,Namespace:kube-system,Attempt:0,}" Dec 16 12:18:22.117529 containerd[1662]: time="2025-12-16T12:18:22.117494742Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d558c8677-8zlvg,Uid:963a1289-7bdf-4dde-a21a-b1e5da9ecfcc,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:18:22.252555 systemd-networkd[1583]: cali3389b4ad3c8: Link UP Dec 16 12:18:22.255275 systemd-networkd[1583]: cali3389b4ad3c8: Gained carrier Dec 16 12:18:22.267332 kubelet[2932]: E1216 
12:18:22.267278 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.176 [INFO][4859] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0 coredns-66bc5c9577- kube-system 7d93cb67-a590-4cec-b00c-d1ee5673d5a2 850 0 2025-12-16 12:17:37 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea coredns-66bc5c9577-hvtrp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali3389b4ad3c8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.176 [INFO][4859] 
cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.206 [INFO][4899] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" HandleID="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Workload="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.206 [INFO][4899] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" HandleID="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Workload="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400034b5c0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"coredns-66bc5c9577-hvtrp", "timestamp":"2025-12-16 12:18:22.206468107 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.206 [INFO][4899] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.206 [INFO][4899] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.206 [INFO][4899] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.217 [INFO][4899] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.223 [INFO][4899] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.228 [INFO][4899] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.230 [INFO][4899] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.232 [INFO][4899] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.233 [INFO][4899] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.235 [INFO][4899] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.239 [INFO][4899] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.244 [INFO][4899] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.79.198/26] block=192.168.79.192/26 handle="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.244 [INFO][4899] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.198/26] handle="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.244 [INFO][4899] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:22.276506 containerd[1662]: 2025-12-16 12:18:22.245 [INFO][4899] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.198/26] IPv6=[] ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" HandleID="k8s-pod-network.b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Workload="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.277721 containerd[1662]: 2025-12-16 12:18:22.248 [INFO][4859] cni-plugin/k8s.go 418: Populated endpoint ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"7d93cb67-a590-4cec-b00c-d1ee5673d5a2", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"coredns-66bc5c9577-hvtrp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3389b4ad3c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:22.277721 containerd[1662]: 2025-12-16 12:18:22.248 [INFO][4859] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.198/32] ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.277721 containerd[1662]: 2025-12-16 12:18:22.248 [INFO][4859] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali3389b4ad3c8 
ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.277721 containerd[1662]: 2025-12-16 12:18:22.255 [INFO][4859] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.277721 containerd[1662]: 2025-12-16 12:18:22.256 [INFO][4859] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"7d93cb67-a590-4cec-b00c-d1ee5673d5a2", ResourceVersion:"850", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", 
ContainerID:"b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e", Pod:"coredns-66bc5c9577-hvtrp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.79.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali3389b4ad3c8", MAC:"82:38:78:cd:67:02", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:22.278984 containerd[1662]: 2025-12-16 12:18:22.271 [INFO][4859] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" Namespace="kube-system" Pod="coredns-66bc5c9577-hvtrp" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-coredns--66bc5c9577--hvtrp-eth0" Dec 16 12:18:22.290000 audit[4926]: NETFILTER_CFG table=filter:133 family=2 entries=44 op=nft_register_chain pid=4926 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:22.290000 audit[4926]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21516 a0=3 a1=ffffd58682e0 a2=0 a3=ffffb69b5fa8 items=0 ppid=4354 pid=4926 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.290000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:22.309494 containerd[1662]: time="2025-12-16T12:18:22.309452717Z" level=info msg="connecting to shim b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e" address="unix:///run/containerd/s/3aa6128746ffc7c8008fb97e20bf1604164ef77daf8a274d210164bc30765d32" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:22.344377 systemd[1]: Started cri-containerd-b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e.scope - libcontainer container b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e. Dec 16 12:18:22.347000 audit[4960]: NETFILTER_CFG table=filter:134 family=2 entries=17 op=nft_register_rule pid=4960 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:22.347000 audit[4960]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff71538b0 a2=0 a3=1 items=0 ppid=3084 pid=4960 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.347000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:22.357441 systemd-networkd[1583]: cali8003fa465d9: Link UP Dec 16 12:18:22.358874 systemd-networkd[1583]: cali8003fa465d9: Gained carrier Dec 16 12:18:22.357000 audit[4960]: NETFILTER_CFG table=nat:135 family=2 entries=35 op=nft_register_chain pid=4960 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:22.357000 audit[4960]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffff71538b0 a2=0 a3=1 items=0 ppid=3084 pid=4960 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.357000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:22.364000 audit: BPF prog-id=236 op=LOAD Dec 16 12:18:22.366000 audit: BPF prog-id=237 op=LOAD Dec 16 12:18:22.366000 audit[4946]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.366000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.366000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:18:22.366000 audit[4946]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.366000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.366000 audit: BPF prog-id=238 op=LOAD Dec 16 12:18:22.366000 audit[4946]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.366000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.366000 audit: BPF prog-id=239 op=LOAD Dec 16 12:18:22.366000 audit[4946]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.366000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.367000 audit: BPF prog-id=239 op=UNLOAD Dec 16 12:18:22.367000 audit[4946]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.367000 audit: BPF prog-id=238 op=UNLOAD Dec 16 12:18:22.367000 audit[4946]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.367000 audit: BPF prog-id=240 op=LOAD Dec 16 12:18:22.367000 audit[4946]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4935 pid=4946 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.367000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6234313037356265633162636361346331396166656436343039666166 Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.169 [INFO][4848] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0 calico-kube-controllers-7db869f9- calico-system 01d79110-1504-420e-beb4-8a0f58f3b458 847 0 2025-12-16 12:17:54 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:7db869f9 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea calico-kube-controllers-7db869f9-8vr6z eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali8003fa465d9 [] [] }} 
ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.169 [INFO][4848] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.209 [INFO][4893] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" HandleID="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.209 [INFO][4893] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" HandleID="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d6e0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"calico-kube-controllers-7db869f9-8vr6z", "timestamp":"2025-12-16 12:18:22.209080396 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 
12:18:22.209 [INFO][4893] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.245 [INFO][4893] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.245 [INFO][4893] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.320 [INFO][4893] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.326 [INFO][4893] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.331 [INFO][4893] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.334 [INFO][4893] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.336 [INFO][4893] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.336 [INFO][4893] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.338 [INFO][4893] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729 Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.341 [INFO][4893] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 
handle="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.350 [INFO][4893] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.79.199/26] block=192.168.79.192/26 handle="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.350 [INFO][4893] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.199/26] handle="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.350 [INFO][4893] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:22.373248 containerd[1662]: 2025-12-16 12:18:22.350 [INFO][4893] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.199/26] IPv6=[] ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" HandleID="k8s-pod-network.7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 16 12:18:22.373705 containerd[1662]: 2025-12-16 12:18:22.353 [INFO][4848] cni-plugin/k8s.go 418: Populated endpoint ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0", GenerateName:"calico-kube-controllers-7db869f9-", Namespace:"calico-system", SelfLink:"", UID:"01d79110-1504-420e-beb4-8a0f58f3b458", 
ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7db869f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"calico-kube-controllers-7db869f9-8vr6z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.79.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8003fa465d9", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:22.373705 containerd[1662]: 2025-12-16 12:18:22.353 [INFO][4848] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.199/32] ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 16 12:18:22.373705 containerd[1662]: 2025-12-16 12:18:22.353 [INFO][4848] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8003fa465d9 ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 
16 12:18:22.373705 containerd[1662]: 2025-12-16 12:18:22.358 [INFO][4848] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 16 12:18:22.373705 containerd[1662]: 2025-12-16 12:18:22.359 [INFO][4848] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0", GenerateName:"calico-kube-controllers-7db869f9-", Namespace:"calico-system", SelfLink:"", UID:"01d79110-1504-420e-beb4-8a0f58f3b458", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 54, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"7db869f9", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729", 
Pod:"calico-kube-controllers-7db869f9-8vr6z", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.79.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali8003fa465d9", MAC:"9e:4a:5f:70:e6:3f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:22.373705 containerd[1662]: 2025-12-16 12:18:22.369 [INFO][4848] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" Namespace="calico-system" Pod="calico-kube-controllers-7db869f9-8vr6z" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--kube--controllers--7db869f9--8vr6z-eth0" Dec 16 12:18:22.392000 audit[4978]: NETFILTER_CFG table=filter:136 family=2 entries=52 op=nft_register_chain pid=4978 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:22.392000 audit[4978]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24312 a0=3 a1=ffffc3348520 a2=0 a3=ffff8dbe8fa8 items=0 ppid=4354 pid=4978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.392000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:22.402445 containerd[1662]: time="2025-12-16T12:18:22.402395815Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-hvtrp,Uid:7d93cb67-a590-4cec-b00c-d1ee5673d5a2,Namespace:kube-system,Attempt:0,} returns sandbox id \"b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e\"" Dec 16 12:18:22.407592 containerd[1662]: time="2025-12-16T12:18:22.406992950Z" 
level=info msg="connecting to shim 7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729" address="unix:///run/containerd/s/beb2eb8513538f353ce3e67c88e6eacd9d5148e1d00e62caff879e93ae7402b6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:22.408076 containerd[1662]: time="2025-12-16T12:18:22.407999633Z" level=info msg="CreateContainer within sandbox \"b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:18:22.424672 containerd[1662]: time="2025-12-16T12:18:22.424612446Z" level=info msg="Container 9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:18:22.434415 containerd[1662]: time="2025-12-16T12:18:22.434358437Z" level=info msg="CreateContainer within sandbox \"b41075bec1bcca4c19afed6409faf97104c6ac909d314b41df9fb87cec1ce57e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef\"" Dec 16 12:18:22.435152 containerd[1662]: time="2025-12-16T12:18:22.435121440Z" level=info msg="StartContainer for \"9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef\"" Dec 16 12:18:22.436411 containerd[1662]: time="2025-12-16T12:18:22.436385004Z" level=info msg="connecting to shim 9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef" address="unix:///run/containerd/s/3aa6128746ffc7c8008fb97e20bf1604164ef77daf8a274d210164bc30765d32" protocol=ttrpc version=3 Dec 16 12:18:22.442407 systemd[1]: Started cri-containerd-7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729.scope - libcontainer container 7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729. Dec 16 12:18:22.462689 systemd[1]: Started cri-containerd-9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef.scope - libcontainer container 9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef. 
Dec 16 12:18:22.464000 audit: BPF prog-id=241 op=LOAD Dec 16 12:18:22.466000 audit: BPF prog-id=242 op=LOAD Dec 16 12:18:22.466000 audit[5005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.466000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.466000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:18:22.466000 audit[5005]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.466000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.468000 audit: BPF prog-id=243 op=LOAD Dec 16 12:18:22.468000 audit[5005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.468000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.468000 audit: BPF prog-id=244 op=LOAD Dec 16 12:18:22.468000 audit[5005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.468000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.469000 audit: BPF prog-id=244 op=UNLOAD Dec 16 12:18:22.469000 audit[5005]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.469000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.469000 audit: BPF prog-id=243 op=UNLOAD Dec 16 12:18:22.470946 systemd-networkd[1583]: cali71aede9d8c4: Link UP Dec 16 12:18:22.471601 systemd-networkd[1583]: cali71aede9d8c4: Gained carrier Dec 16 12:18:22.469000 audit[5005]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.469000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.472000 audit: BPF prog-id=245 op=LOAD Dec 16 12:18:22.472000 audit[5005]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4994 pid=5005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.472000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3737313766633836383334653733386431666661656534643136666630 Dec 16 12:18:22.480000 audit: BPF prog-id=246 op=LOAD Dec 16 12:18:22.481000 audit: BPF prog-id=247 op=LOAD Dec 16 12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.481000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 
syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.481000 audit: BPF prog-id=248 op=LOAD Dec 16 12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.481000 audit: BPF prog-id=249 op=LOAD Dec 16 12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.481000 audit: BPF prog-id=249 op=UNLOAD Dec 16 
12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.481000 audit: BPF prog-id=248 op=UNLOAD Dec 16 12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.481000 audit: BPF prog-id=250 op=LOAD Dec 16 12:18:22.481000 audit[5018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4935 pid=5018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939343734313334393764366135643335623130383535313339323666 Dec 16 12:18:22.494648 
containerd[1662]: 2025-12-16 12:18:22.187 [INFO][4865] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0 calico-apiserver-5d558c8677- calico-apiserver 963a1289-7bdf-4dde-a21a-b1e5da9ecfcc 853 0 2025-12-16 12:17:47 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:5d558c8677 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-5-b12717c6ea calico-apiserver-5d558c8677-8zlvg eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali71aede9d8c4 [] [] }} ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.187 [INFO][4865] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.220 [INFO][4907] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" HandleID="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.220 [INFO][4907] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" HandleID="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a0df0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-5-b12717c6ea", "pod":"calico-apiserver-5d558c8677-8zlvg", "timestamp":"2025-12-16 12:18:22.220005911 +0000 UTC"}, Hostname:"ci-4547-0-0-5-b12717c6ea", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.220 [INFO][4907] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.350 [INFO][4907] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.350 [INFO][4907] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-5-b12717c6ea' Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.418 [INFO][4907] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.426 [INFO][4907] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.436 [INFO][4907] ipam/ipam.go 511: Trying affinity for 192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.439 [INFO][4907] ipam/ipam.go 158: Attempting to load block cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.443 [INFO][4907] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.79.192/26 host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.443 [INFO][4907] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.79.192/26 handle="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.445 [INFO][4907] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2 Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.451 [INFO][4907] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.79.192/26 handle="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.460 [INFO][4907] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.79.200/26] block=192.168.79.192/26 handle="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.460 [INFO][4907] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.79.200/26] handle="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" host="ci-4547-0-0-5-b12717c6ea" Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.460 [INFO][4907] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:18:22.494648 containerd[1662]: 2025-12-16 12:18:22.460 [INFO][4907] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.79.200/26] IPv6=[] ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" HandleID="k8s-pod-network.f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Workload="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.495214 containerd[1662]: 2025-12-16 12:18:22.463 [INFO][4865] cni-plugin/k8s.go 418: Populated endpoint ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0", GenerateName:"calico-apiserver-5d558c8677-", Namespace:"calico-apiserver", SelfLink:"", UID:"963a1289-7bdf-4dde-a21a-b1e5da9ecfcc", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d558c8677", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"", Pod:"calico-apiserver-5d558c8677-8zlvg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali71aede9d8c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:22.495214 containerd[1662]: 2025-12-16 12:18:22.463 [INFO][4865] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.79.200/32] ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.495214 containerd[1662]: 2025-12-16 12:18:22.463 [INFO][4865] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali71aede9d8c4 ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.495214 containerd[1662]: 2025-12-16 12:18:22.471 [INFO][4865] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" 
Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.495214 containerd[1662]: 2025-12-16 12:18:22.472 [INFO][4865] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0", GenerateName:"calico-apiserver-5d558c8677-", Namespace:"calico-apiserver", SelfLink:"", UID:"963a1289-7bdf-4dde-a21a-b1e5da9ecfcc", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 17, 47, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"5d558c8677", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-5-b12717c6ea", ContainerID:"f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2", Pod:"calico-apiserver-5d558c8677-8zlvg", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.79.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali71aede9d8c4", MAC:"da:43:7f:4e:ed:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:18:22.495214 containerd[1662]: 2025-12-16 12:18:22.488 [INFO][4865] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" Namespace="calico-apiserver" Pod="calico-apiserver-5d558c8677-8zlvg" WorkloadEndpoint="ci--4547--0--0--5--b12717c6ea-k8s-calico--apiserver--5d558c8677--8zlvg-eth0" Dec 16 12:18:22.506000 audit[5062]: NETFILTER_CFG table=filter:137 family=2 entries=57 op=nft_register_chain pid=5062 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:18:22.506000 audit[5062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27812 a0=3 a1=ffffc86567f0 a2=0 a3=ffff956b4fa8 items=0 ppid=4354 pid=5062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.506000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:18:22.514681 containerd[1662]: time="2025-12-16T12:18:22.514513534Z" level=info msg="StartContainer for \"9947413497d6a5d35b1085513926f490c16a5b78fc1e499cab26e3e50787b6ef\" returns successfully" Dec 16 12:18:22.519322 containerd[1662]: time="2025-12-16T12:18:22.519281269Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-7db869f9-8vr6z,Uid:01d79110-1504-420e-beb4-8a0f58f3b458,Namespace:calico-system,Attempt:0,} returns sandbox id \"7717fc86834e738d1ffaee4d16ff092e0f7c30bdf5f2ea86abf8d1a57bb25729\"" Dec 16 12:18:22.521398 containerd[1662]: time="2025-12-16T12:18:22.521118675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 
16 12:18:22.528118 containerd[1662]: time="2025-12-16T12:18:22.528068577Z" level=info msg="connecting to shim f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2" address="unix:///run/containerd/s/8a1ad782b2ddc956d4c97af55b82bf24422ce89ad25bd1a1278e2179898c82a0" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:18:22.554029 systemd[1]: Started cri-containerd-f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2.scope - libcontainer container f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2. Dec 16 12:18:22.566000 audit: BPF prog-id=251 op=LOAD Dec 16 12:18:22.567000 audit: BPF prog-id=252 op=LOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 Dec 16 12:18:22.567000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 Dec 16 12:18:22.567000 audit: BPF prog-id=253 
op=LOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 Dec 16 12:18:22.567000 audit: BPF prog-id=254 op=LOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 Dec 16 12:18:22.567000 audit: BPF prog-id=254 op=UNLOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 
Dec 16 12:18:22.567000 audit: BPF prog-id=253 op=UNLOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 Dec 16 12:18:22.567000 audit: BPF prog-id=255 op=LOAD Dec 16 12:18:22.567000 audit[5094]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5082 pid=5094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:22.567000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630663634316231666463613531613339666435616539333133323832 Dec 16 12:18:22.595322 containerd[1662]: time="2025-12-16T12:18:22.595283273Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-5d558c8677-8zlvg,Uid:963a1289-7bdf-4dde-a21a-b1e5da9ecfcc,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"f0f641b1fdca51a39fd5ae931328241bea907bb676aa9a725e82e24cc20bdef2\"" Dec 16 12:18:22.864027 containerd[1662]: time="2025-12-16T12:18:22.863979013Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:22.865779 containerd[1662]: time="2025-12-16T12:18:22.865720419Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:18:22.866031 containerd[1662]: time="2025-12-16T12:18:22.865834779Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:22.866070 kubelet[2932]: E1216 12:18:22.866001 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:18:22.866070 kubelet[2932]: E1216 12:18:22.866044 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:18:22.867934 kubelet[2932]: E1216 12:18:22.866208 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:22.867934 kubelet[2932]: E1216 12:18:22.866257 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:18:22.868247 containerd[1662]: time="2025-12-16T12:18:22.868122947Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:18:23.214586 containerd[1662]: time="2025-12-16T12:18:23.214536816Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:23.215877 containerd[1662]: time="2025-12-16T12:18:23.215840380Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:18:23.215927 containerd[1662]: time="2025-12-16T12:18:23.215882901Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:23.216086 kubelet[2932]: E1216 12:18:23.216027 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:23.216138 kubelet[2932]: E1216 12:18:23.216093 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:23.216180 kubelet[2932]: E1216 12:18:23.216162 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:23.216212 kubelet[2932]: E1216 12:18:23.216195 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:18:23.269300 kubelet[2932]: E1216 12:18:23.269016 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:18:23.270377 kubelet[2932]: E1216 12:18:23.270337 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" 
podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:18:23.378000 audit[5124]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=5124 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:23.378000 audit[5124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe36784f0 a2=0 a3=1 items=0 ppid=3084 pid=5124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:23.378000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:23.392000 audit[5124]: NETFILTER_CFG table=nat:139 family=2 entries=56 op=nft_register_chain pid=5124 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:18:23.392000 audit[5124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffe36784f0 a2=0 a3=1 items=0 ppid=3084 pid=5124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:23.392000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:18:23.669093 systemd-networkd[1583]: cali71aede9d8c4: Gained IPv6LL Dec 16 12:18:23.733092 systemd-networkd[1583]: cali8003fa465d9: Gained IPv6LL Dec 16 12:18:23.733444 systemd-networkd[1583]: cali3389b4ad3c8: Gained IPv6LL Dec 16 12:18:24.274715 kubelet[2932]: E1216 12:18:24.274667 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:18:24.275687 kubelet[2932]: E1216 12:18:24.275525 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:18:24.288060 kubelet[2932]: I1216 12:18:24.287992 2932 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-hvtrp" podStartSLOduration=47.287958174 podStartE2EDuration="47.287958174s" podCreationTimestamp="2025-12-16 12:17:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:18:23.303665342 +0000 UTC m=+51.354580453" watchObservedRunningTime="2025-12-16 12:18:24.287958174 +0000 UTC m=+52.338873285" Dec 16 12:18:25.110491 containerd[1662]: time="2025-12-16T12:18:25.110431169Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:18:25.464299 containerd[1662]: time="2025-12-16T12:18:25.464175902Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:25.465697 containerd[1662]: time="2025-12-16T12:18:25.465635026Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:18:25.465746 containerd[1662]: time="2025-12-16T12:18:25.465693427Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:25.465985 kubelet[2932]: E1216 12:18:25.465882 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:25.465985 kubelet[2932]: E1216 12:18:25.465944 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:25.466540 kubelet[2932]: E1216 12:18:25.466361 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:25.467286 containerd[1662]: time="2025-12-16T12:18:25.467256832Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:18:25.797964 containerd[1662]: time="2025-12-16T12:18:25.797767930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:25.799460 containerd[1662]: time="2025-12-16T12:18:25.799407535Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": 
failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:18:25.799649 containerd[1662]: time="2025-12-16T12:18:25.799488216Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:25.799689 kubelet[2932]: E1216 12:18:25.799611 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:25.799689 kubelet[2932]: E1216 12:18:25.799653 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:25.799770 kubelet[2932]: E1216 12:18:25.799715 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:25.799770 kubelet[2932]: E1216 12:18:25.799751 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:18:33.113431 containerd[1662]: time="2025-12-16T12:18:33.112963400Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:18:33.455315 containerd[1662]: time="2025-12-16T12:18:33.455255177Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:33.456828 containerd[1662]: time="2025-12-16T12:18:33.456735541Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:18:33.456828 containerd[1662]: time="2025-12-16T12:18:33.456772502Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:33.457253 kubelet[2932]: E1216 12:18:33.456960 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:33.457253 kubelet[2932]: E1216 12:18:33.457005 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:33.457253 kubelet[2932]: E1216 12:18:33.457076 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod 
goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:33.457253 kubelet[2932]: E1216 12:18:33.457108 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:18:35.111043 containerd[1662]: time="2025-12-16T12:18:35.110795719Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:18:35.451595 containerd[1662]: time="2025-12-16T12:18:35.451171530Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:35.453213 containerd[1662]: time="2025-12-16T12:18:35.453110056Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:18:35.453318 containerd[1662]: time="2025-12-16T12:18:35.453133536Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:35.453437 kubelet[2932]: E1216 12:18:35.453382 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:18:35.453437 kubelet[2932]: E1216 12:18:35.453432 2932 
kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:18:35.453908 kubelet[2932]: E1216 12:18:35.453589 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:35.454142 containerd[1662]: time="2025-12-16T12:18:35.454002139Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:18:35.807273 containerd[1662]: time="2025-12-16T12:18:35.806937869Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:35.808224 containerd[1662]: time="2025-12-16T12:18:35.808190593Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:18:35.808426 kubelet[2932]: E1216 12:18:35.808392 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:35.808490 kubelet[2932]: E1216 12:18:35.808438 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:35.808735 containerd[1662]: time="2025-12-16T12:18:35.808255153Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:35.808791 kubelet[2932]: E1216 12:18:35.808657 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:35.808791 kubelet[2932]: E1216 12:18:35.808703 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:18:35.809192 containerd[1662]: time="2025-12-16T12:18:35.809020076Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:18:36.147141 containerd[1662]: time="2025-12-16T12:18:36.147090719Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:36.149584 containerd[1662]: time="2025-12-16T12:18:36.149475006Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:18:36.149584 
containerd[1662]: time="2025-12-16T12:18:36.149530446Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:36.149744 kubelet[2932]: E1216 12:18:36.149710 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:18:36.149841 kubelet[2932]: E1216 12:18:36.149760 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:18:36.150011 kubelet[2932]: E1216 12:18:36.149973 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:36.150062 kubelet[2932]: E1216 12:18:36.150036 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:18:36.150142 containerd[1662]: time="2025-12-16T12:18:36.150104808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:18:36.492092 containerd[1662]: time="2025-12-16T12:18:36.491919503Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:36.493123 containerd[1662]: time="2025-12-16T12:18:36.493088587Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:18:36.493199 containerd[1662]: time="2025-12-16T12:18:36.493165507Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:36.493376 kubelet[2932]: E1216 12:18:36.493340 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:36.493646 kubelet[2932]: E1216 12:18:36.493386 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:18:36.493646 kubelet[2932]: E1216 12:18:36.493459 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:36.493646 kubelet[2932]: E1216 12:18:36.493493 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:18:38.111201 containerd[1662]: time="2025-12-16T12:18:38.110967289Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:18:38.461003 containerd[1662]: time="2025-12-16T12:18:38.460952530Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:38.462358 containerd[1662]: time="2025-12-16T12:18:38.462315134Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:18:38.462423 containerd[1662]: time="2025-12-16T12:18:38.462365934Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:38.462641 kubelet[2932]: E1216 12:18:38.462581 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:18:38.462641 kubelet[2932]: E1216 12:18:38.462629 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:18:38.462994 kubelet[2932]: E1216 12:18:38.462711 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:38.462994 kubelet[2932]: E1216 12:18:38.462747 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:18:40.111562 kubelet[2932]: E1216 12:18:40.111499 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to 
\"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:18:48.112261 kubelet[2932]: E1216 12:18:48.112202 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:18:48.115223 kubelet[2932]: E1216 12:18:48.113199 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 
12:18:51.111913 kubelet[2932]: E1216 12:18:51.111478 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:18:51.113188 kubelet[2932]: E1216 12:18:51.112731 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:18:51.113369 kubelet[2932]: E1216 12:18:51.113325 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:18:52.112392 containerd[1662]: time="2025-12-16T12:18:52.112347935Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:18:52.474841 containerd[1662]: 
time="2025-12-16T12:18:52.474773896Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:52.476290 containerd[1662]: time="2025-12-16T12:18:52.476224500Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:18:52.476428 containerd[1662]: time="2025-12-16T12:18:52.476275261Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:52.476502 kubelet[2932]: E1216 12:18:52.476460 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:52.476795 kubelet[2932]: E1216 12:18:52.476507 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:52.476795 kubelet[2932]: E1216 12:18:52.476588 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:52.477833 containerd[1662]: time="2025-12-16T12:18:52.477612625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:18:52.811947 
containerd[1662]: time="2025-12-16T12:18:52.810948093Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:52.814524 containerd[1662]: time="2025-12-16T12:18:52.814478584Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:18:52.814630 containerd[1662]: time="2025-12-16T12:18:52.814568704Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:52.814776 kubelet[2932]: E1216 12:18:52.814738 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:52.814832 kubelet[2932]: E1216 12:18:52.814789 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:52.814896 kubelet[2932]: E1216 12:18:52.814876 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:52.814946 kubelet[2932]: E1216 12:18:52.814921 2932 
pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:18:59.111677 containerd[1662]: time="2025-12-16T12:18:59.111300632Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:18:59.462986 containerd[1662]: time="2025-12-16T12:18:59.462902118Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:59.464641 containerd[1662]: time="2025-12-16T12:18:59.464574644Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:18:59.464740 containerd[1662]: time="2025-12-16T12:18:59.464679444Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:59.464917 kubelet[2932]: E1216 12:18:59.464871 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:59.465472 kubelet[2932]: E1216 12:18:59.464924 2932 kuberuntime_image.go:43] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:59.465472 kubelet[2932]: E1216 12:18:59.464987 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:59.465472 kubelet[2932]: E1216 12:18:59.465018 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:19:00.110169 containerd[1662]: time="2025-12-16T12:19:00.110084871Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:19:00.462276 containerd[1662]: time="2025-12-16T12:19:00.462223879Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:00.463404 containerd[1662]: time="2025-12-16T12:19:00.463368803Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:19:00.463470 containerd[1662]: time="2025-12-16T12:19:00.463402923Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 
12:19:00.463665 kubelet[2932]: E1216 12:19:00.463613 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:00.463770 kubelet[2932]: E1216 12:19:00.463755 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:00.464059 kubelet[2932]: E1216 12:19:00.463944 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:00.467044 containerd[1662]: time="2025-12-16T12:19:00.466996134Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:19:00.834208 containerd[1662]: time="2025-12-16T12:19:00.834015910Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:00.835353 containerd[1662]: time="2025-12-16T12:19:00.835288554Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:19:00.835418 containerd[1662]: time="2025-12-16T12:19:00.835312634Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: 
active requests=0, bytes read=0" Dec 16 12:19:00.835582 kubelet[2932]: E1216 12:19:00.835545 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:00.836022 kubelet[2932]: E1216 12:19:00.835593 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:00.836022 kubelet[2932]: E1216 12:19:00.835660 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:00.836022 kubelet[2932]: E1216 12:19:00.835697 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" 
pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:19:04.112442 containerd[1662]: time="2025-12-16T12:19:04.112202010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:04.447054 containerd[1662]: time="2025-12-16T12:19:04.446962522Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:04.448284 containerd[1662]: time="2025-12-16T12:19:04.448238286Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:04.448927 containerd[1662]: time="2025-12-16T12:19:04.448891128Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:04.449132 kubelet[2932]: E1216 12:19:04.449099 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:04.449497 kubelet[2932]: E1216 12:19:04.449151 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:04.449497 kubelet[2932]: E1216 12:19:04.449217 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:04.449497 kubelet[2932]: E1216 12:19:04.449247 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:19:05.110898 containerd[1662]: time="2025-12-16T12:19:05.110854409Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:05.480399 containerd[1662]: time="2025-12-16T12:19:05.480334512Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:05.482689 containerd[1662]: time="2025-12-16T12:19:05.482639399Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:05.482751 containerd[1662]: time="2025-12-16T12:19:05.482696240Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:05.482944 kubelet[2932]: E1216 12:19:05.482886 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:05.482944 kubelet[2932]: E1216 12:19:05.482943 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:05.483206 kubelet[2932]: E1216 12:19:05.483014 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:05.483206 kubelet[2932]: E1216 12:19:05.483045 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:19:06.112261 kubelet[2932]: E1216 12:19:06.112166 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:19:06.112800 containerd[1662]: time="2025-12-16T12:19:06.112457017Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:19:06.493542 containerd[1662]: time="2025-12-16T12:19:06.493493157Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:06.494731 containerd[1662]: time="2025-12-16T12:19:06.494673161Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:19:06.494805 containerd[1662]: time="2025-12-16T12:19:06.494760441Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:06.495035 kubelet[2932]: E1216 12:19:06.494928 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:06.495035 kubelet[2932]: E1216 12:19:06.494988 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:06.495383 kubelet[2932]: E1216 12:19:06.495059 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod 
calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:06.495383 kubelet[2932]: E1216 12:19:06.495091 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:19:13.111139 kubelet[2932]: E1216 12:19:13.111088 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:19:13.111776 kubelet[2932]: E1216 12:19:13.111512 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:19:15.110318 kubelet[2932]: E1216 12:19:15.110267 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:19:17.111635 kubelet[2932]: E1216 12:19:17.111541 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:19:17.112532 kubelet[2932]: E1216 12:19:17.112487 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:19:21.109672 kubelet[2932]: E1216 12:19:21.109606 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:19:24.111144 kubelet[2932]: E1216 12:19:24.110575 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not 
found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:19:28.113296 kubelet[2932]: E1216 12:19:28.112139 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:19:28.113296 kubelet[2932]: E1216 12:19:28.112325 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:19:29.110505 kubelet[2932]: E1216 12:19:29.110455 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:19:30.112795 kubelet[2932]: E1216 12:19:30.110844 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:19:35.110682 kubelet[2932]: E1216 12:19:35.110310 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:19:37.111534 kubelet[2932]: E1216 12:19:37.111490 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:19:41.110740 containerd[1662]: time="2025-12-16T12:19:41.110654675Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:19:41.111390 kubelet[2932]: E1216 12:19:41.110532 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:19:41.488908 containerd[1662]: time="2025-12-16T12:19:41.488852846Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:41.494661 containerd[1662]: time="2025-12-16T12:19:41.494583744Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:19:41.494847 containerd[1662]: time="2025-12-16T12:19:41.494694745Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:41.494924 kubelet[2932]: E1216 12:19:41.494878 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:19:41.494973 kubelet[2932]: E1216 12:19:41.494933 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:19:41.495603 kubelet[2932]: E1216 12:19:41.495012 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:41.497079 containerd[1662]: time="2025-12-16T12:19:41.497024952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:19:41.998531 containerd[1662]: time="2025-12-16T12:19:41.998462118Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:42.000286 containerd[1662]: time="2025-12-16T12:19:42.000231564Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:19:42.000356 containerd[1662]: time="2025-12-16T12:19:42.000294684Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:42.000475 kubelet[2932]: E1216 12:19:42.000434 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound 
desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:19:42.000526 kubelet[2932]: E1216 12:19:42.000485 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:19:42.000580 kubelet[2932]: E1216 12:19:42.000560 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:42.000640 kubelet[2932]: E1216 12:19:42.000601 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:19:43.111960 containerd[1662]: time="2025-12-16T12:19:43.111919365Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 
12:19:43.436106 containerd[1662]: time="2025-12-16T12:19:43.436044723Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:43.437426 containerd[1662]: time="2025-12-16T12:19:43.437355127Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:19:43.437603 containerd[1662]: time="2025-12-16T12:19:43.437407127Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:43.437651 kubelet[2932]: E1216 12:19:43.437589 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:19:43.437651 kubelet[2932]: E1216 12:19:43.437636 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:19:43.437970 kubelet[2932]: E1216 12:19:43.437701 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:43.437970 kubelet[2932]: E1216 12:19:43.437734 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to 
\"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:19:45.110187 containerd[1662]: time="2025-12-16T12:19:45.110135206Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:45.464215 containerd[1662]: time="2025-12-16T12:19:45.464101300Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:45.466405 containerd[1662]: time="2025-12-16T12:19:45.466355227Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:45.466472 containerd[1662]: time="2025-12-16T12:19:45.466441227Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:45.466817 kubelet[2932]: E1216 12:19:45.466738 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:45.466817 kubelet[2932]: E1216 12:19:45.466780 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:45.467458 kubelet[2932]: E1216 12:19:45.467195 2932 kuberuntime_manager.go:1449] 
"Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:45.467458 kubelet[2932]: E1216 12:19:45.467256 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:19:47.110964 containerd[1662]: time="2025-12-16T12:19:47.110713014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:19:47.440929 containerd[1662]: time="2025-12-16T12:19:47.440873311Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:47.442424 containerd[1662]: time="2025-12-16T12:19:47.442362036Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:19:47.442989 containerd[1662]: time="2025-12-16T12:19:47.442381236Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:47.443030 kubelet[2932]: E1216 12:19:47.442631 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:47.443030 kubelet[2932]: E1216 12:19:47.442679 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:47.443030 kubelet[2932]: E1216 12:19:47.442755 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:47.443030 kubelet[2932]: E1216 12:19:47.442786 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:19:48.110975 containerd[1662]: time="2025-12-16T12:19:48.110930697Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:19:48.455850 containerd[1662]: time="2025-12-16T12:19:48.455626641Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:48.459235 containerd[1662]: time="2025-12-16T12:19:48.459160773Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:19:48.459483 kubelet[2932]: E1216 12:19:48.459388 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:48.459483 kubelet[2932]: E1216 12:19:48.459437 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:48.459927 containerd[1662]: time="2025-12-16T12:19:48.459435014Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:48.459960 kubelet[2932]: E1216 12:19:48.459912 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:48.461989 containerd[1662]: time="2025-12-16T12:19:48.461943782Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:19:48.792136 containerd[1662]: time="2025-12-16T12:19:48.791984159Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:48.793908 containerd[1662]: time="2025-12-16T12:19:48.793800765Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:19:48.794022 containerd[1662]: time="2025-12-16T12:19:48.793889605Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:48.794143 kubelet[2932]: E1216 12:19:48.794089 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:48.794192 kubelet[2932]: E1216 12:19:48.794151 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:48.794273 kubelet[2932]: E1216 12:19:48.794225 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:48.794311 kubelet[2932]: E1216 12:19:48.794266 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = 
failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:19:53.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.29.66:22-139.178.68.195:57898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:53.994849 systemd[1]: Started sshd@9-10.0.29.66:22-139.178.68.195:57898.service - OpenSSH per-connection server daemon (139.178.68.195:57898). Dec 16 12:19:53.996078 kernel: kauditd_printk_skb: 233 callbacks suppressed Dec 16 12:19:53.996115 kernel: audit: type=1130 audit(1765887593.994:742): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.29.66:22-139.178.68.195:57898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:19:54.113274 kubelet[2932]: E1216 12:19:54.113201 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:19:54.830000 audit[5286]: USER_ACCT pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.831350 sshd[5286]: Accepted publickey for core from 139.178.68.195 port 57898 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:19:54.834929 kernel: audit: type=1101 audit(1765887594.830:743): pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.834000 audit[5286]: CRED_ACQ pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.836617 sshd-session[5286]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:19:54.838833 kernel: audit: type=1103 audit(1765887594.834:744): pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.838889 kernel: audit: type=1006 audit(1765887594.834:745): pid=5286 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Dec 16 12:19:54.834000 audit[5286]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd26ee690 a2=3 a3=0 items=0 ppid=1 pid=5286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.844009 kernel: audit: type=1300 audit(1765887594.834:745): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd26ee690 a2=3 a3=0 items=0 ppid=1 pid=5286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:54.834000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:19:54.845276 kernel: audit: type=1327 audit(1765887594.834:745): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:19:54.848430 systemd-logind[1646]: New session 11 of user core. Dec 16 12:19:54.856044 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 16 12:19:54.858000 audit[5286]: USER_START pid=5286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.862000 audit[5290]: CRED_ACQ pid=5290 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.866466 kernel: audit: type=1105 audit(1765887594.858:746): pid=5286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:54.866521 kernel: audit: type=1103 audit(1765887594.862:747): pid=5290 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:55.111182 containerd[1662]: time="2025-12-16T12:19:55.111025158Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:55.391386 sshd[5290]: Connection closed by 139.178.68.195 port 57898 Dec 16 12:19:55.391967 sshd-session[5286]: pam_unix(sshd:session): session closed for user core Dec 16 12:19:55.393000 audit[5286]: USER_END pid=5286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:55.397716 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 12:19:55.399245 systemd-logind[1646]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:19:55.393000 audit[5286]: CRED_DISP pid=5286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:55.401688 systemd[1]: sshd@9-10.0.29.66:22-139.178.68.195:57898.service: Deactivated successfully. Dec 16 12:19:55.403516 kernel: audit: type=1106 audit(1765887595.393:748): pid=5286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:55.403601 kernel: audit: type=1104 audit(1765887595.393:749): pid=5286 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:19:55.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.29.66:22-139.178.68.195:57898 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:19:55.406329 systemd-logind[1646]: Removed session 11. 
Dec 16 12:19:55.443031 containerd[1662]: time="2025-12-16T12:19:55.442939022Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:55.445275 containerd[1662]: time="2025-12-16T12:19:55.445202989Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:55.445529 containerd[1662]: time="2025-12-16T12:19:55.445235429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:55.445758 kubelet[2932]: E1216 12:19:55.445717 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:55.446224 kubelet[2932]: E1216 12:19:55.445868 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:55.446449 kubelet[2932]: E1216 12:19:55.446363 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:55.446449 kubelet[2932]: E1216 12:19:55.446409 2932 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:19:56.111124 kubelet[2932]: E1216 12:19:56.111027 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:19:57.111199 kubelet[2932]: E1216 12:19:57.110240 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:20:00.559143 systemd[1]: Started sshd@10-10.0.29.66:22-139.178.68.195:43216.service - OpenSSH per-connection server daemon (139.178.68.195:43216). Dec 16 12:20:00.558000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.29.66:22-139.178.68.195:43216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:00.559967 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:20:00.560084 kernel: audit: type=1130 audit(1765887600.558:751): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.29.66:22-139.178.68.195:43216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:01.110716 kubelet[2932]: E1216 12:20:01.110569 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:20:01.386000 audit[5304]: USER_ACCT pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.387431 sshd[5304]: Accepted publickey for core from 139.178.68.195 port 43216 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:01.389745 sshd-session[5304]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:01.388000 audit[5304]: CRED_ACQ pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.392893 kernel: audit: type=1101 audit(1765887601.386:752): pid=5304 uid=0 auid=4294967295 
ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.392961 kernel: audit: type=1103 audit(1765887601.388:753): pid=5304 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.394651 kernel: audit: type=1006 audit(1765887601.388:754): pid=5304 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Dec 16 12:20:01.388000 audit[5304]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe323b4e0 a2=3 a3=0 items=0 ppid=1 pid=5304 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:01.397937 kernel: audit: type=1300 audit(1765887601.388:754): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe323b4e0 a2=3 a3=0 items=0 ppid=1 pid=5304 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:01.398045 kernel: audit: type=1327 audit(1765887601.388:754): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:01.388000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:01.398246 systemd-logind[1646]: New session 12 of user core. Dec 16 12:20:01.409311 systemd[1]: Started session-12.scope - Session 12 of User core. 
Dec 16 12:20:01.411000 audit[5304]: USER_START pid=5304 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.413000 audit[5308]: CRED_ACQ pid=5308 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.418331 kernel: audit: type=1105 audit(1765887601.411:755): pid=5304 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.418464 kernel: audit: type=1103 audit(1765887601.413:756): pid=5308 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.925199 sshd[5308]: Connection closed by 139.178.68.195 port 43216 Dec 16 12:20:01.925281 sshd-session[5304]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:01.927000 audit[5304]: USER_END pid=5304 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.932220 systemd[1]: 
sshd@10-10.0.29.66:22-139.178.68.195:43216.service: Deactivated successfully. Dec 16 12:20:01.927000 audit[5304]: CRED_DISP pid=5304 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.935492 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:20:01.937339 kernel: audit: type=1106 audit(1765887601.927:757): pid=5304 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.937445 kernel: audit: type=1104 audit(1765887601.927:758): pid=5304 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:01.933000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.29.66:22-139.178.68.195:43216 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:01.940163 systemd-logind[1646]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:20:01.941860 systemd-logind[1646]: Removed session 12. 
Dec 16 12:20:02.111865 kubelet[2932]: E1216 12:20:02.111769 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:20:05.112058 kubelet[2932]: E1216 12:20:05.112004 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:20:07.095459 systemd[1]: Started sshd@11-10.0.29.66:22-139.178.68.195:43222.service - OpenSSH 
per-connection server daemon (139.178.68.195:43222). Dec 16 12:20:07.094000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.29.66:22-139.178.68.195:43222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:07.096169 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:20:07.096211 kernel: audit: type=1130 audit(1765887607.094:760): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.29.66:22-139.178.68.195:43222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:07.925000 audit[5322]: USER_ACCT pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.926909 sshd[5322]: Accepted publickey for core from 139.178.68.195 port 43222 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:07.929000 audit[5322]: CRED_ACQ pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.931681 sshd-session[5322]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:07.933730 kernel: audit: type=1101 audit(1765887607.925:761): pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.933793 kernel: audit: type=1103 audit(1765887607.929:762): 
pid=5322 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.933825 kernel: audit: type=1006 audit(1765887607.929:763): pid=5322 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Dec 16 12:20:07.935378 kernel: audit: type=1300 audit(1765887607.929:763): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb0af790 a2=3 a3=0 items=0 ppid=1 pid=5322 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:07.929000 audit[5322]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdb0af790 a2=3 a3=0 items=0 ppid=1 pid=5322 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:07.936502 systemd-logind[1646]: New session 13 of user core. Dec 16 12:20:07.929000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:07.939535 kernel: audit: type=1327 audit(1765887607.929:763): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:07.948154 systemd[1]: Started session-13.scope - Session 13 of User core. 
Dec 16 12:20:07.950000 audit[5322]: USER_START pid=5322 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.952000 audit[5328]: CRED_ACQ pid=5328 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.957506 kernel: audit: type=1105 audit(1765887607.950:764): pid=5322 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:07.957600 kernel: audit: type=1103 audit(1765887607.952:765): pid=5328 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:08.115604 kubelet[2932]: E1216 12:20:08.115559 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:20:08.116036 
kubelet[2932]: E1216 12:20:08.116012 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:20:08.473074 sshd[5328]: Connection closed by 139.178.68.195 port 43222 Dec 16 12:20:08.473518 sshd-session[5322]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:08.474000 audit[5322]: USER_END pid=5322 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:08.474000 audit[5322]: CRED_DISP pid=5322 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:08.479771 systemd[1]: sshd@11-10.0.29.66:22-139.178.68.195:43222.service: Deactivated successfully. Dec 16 12:20:08.480215 systemd-logind[1646]: Session 13 logged out. Waiting for processes to exit. 
Dec 16 12:20:08.481744 kernel: audit: type=1106 audit(1765887608.474:766): pid=5322 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:08.481830 kernel: audit: type=1104 audit(1765887608.474:767): pid=5322 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:08.482333 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:20:08.479000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.29.66:22-139.178.68.195:43222 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:08.483841 systemd-logind[1646]: Removed session 13. Dec 16 12:20:08.645000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.29.66:22-139.178.68.195:43228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:08.646445 systemd[1]: Started sshd@12-10.0.29.66:22-139.178.68.195:43228.service - OpenSSH per-connection server daemon (139.178.68.195:43228). 
Dec 16 12:20:09.468000 audit[5342]: USER_ACCT pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:09.469575 sshd[5342]: Accepted publickey for core from 139.178.68.195 port 43228 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:09.470000 audit[5342]: CRED_ACQ pid=5342 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:09.470000 audit[5342]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff89ede90 a2=3 a3=0 items=0 ppid=1 pid=5342 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:09.470000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:09.471750 sshd-session[5342]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:09.482890 systemd-logind[1646]: New session 14 of user core. Dec 16 12:20:09.493130 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 16 12:20:09.496000 audit[5342]: USER_START pid=5342 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:09.498000 audit[5346]: CRED_ACQ pid=5346 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:10.041004 sshd[5346]: Connection closed by 139.178.68.195 port 43228 Dec 16 12:20:10.041331 sshd-session[5342]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:10.043000 audit[5342]: USER_END pid=5342 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:10.043000 audit[5342]: CRED_DISP pid=5342 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:10.049886 systemd[1]: sshd@12-10.0.29.66:22-139.178.68.195:43228.service: Deactivated successfully. Dec 16 12:20:10.049000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.29.66:22-139.178.68.195:43228 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:10.052011 systemd[1]: session-14.scope: Deactivated successfully. 
Dec 16 12:20:10.055337 systemd-logind[1646]: Session 14 logged out. Waiting for processes to exit. Dec 16 12:20:10.059773 systemd-logind[1646]: Removed session 14. Dec 16 12:20:10.112562 kubelet[2932]: E1216 12:20:10.112397 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:20:10.225368 systemd[1]: Started sshd@13-10.0.29.66:22-139.178.68.195:43244.service - OpenSSH per-connection server daemon (139.178.68.195:43244). Dec 16 12:20:10.224000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.29.66:22-139.178.68.195:43244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:11.092000 audit[5358]: USER_ACCT pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:11.093138 sshd[5358]: Accepted publickey for core from 139.178.68.195 port 43244 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:11.093000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:11.093000 audit[5358]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe6a3db50 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:11.093000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:11.095157 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:11.100894 systemd-logind[1646]: New session 15 of user core. Dec 16 12:20:11.107016 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 12:20:11.110000 audit[5358]: USER_START pid=5358 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:11.112000 audit[5362]: CRED_ACQ pid=5362 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:11.645718 sshd[5362]: Connection closed by 139.178.68.195 port 43244 Dec 16 12:20:11.646391 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:11.647000 audit[5358]: USER_END pid=5358 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:11.647000 audit[5358]: CRED_DISP pid=5358 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:11.651704 systemd[1]: sshd@13-10.0.29.66:22-139.178.68.195:43244.service: Deactivated successfully. Dec 16 12:20:11.651000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.29.66:22-139.178.68.195:43244 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:11.654075 systemd[1]: session-15.scope: Deactivated successfully. 
Dec 16 12:20:11.655206 systemd-logind[1646]: Session 15 logged out. Waiting for processes to exit. Dec 16 12:20:11.657147 systemd-logind[1646]: Removed session 15. Dec 16 12:20:15.112719 kubelet[2932]: E1216 12:20:15.112618 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:20:16.114371 kubelet[2932]: E1216 12:20:16.114298 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:20:16.808000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.29.66:22-139.178.68.195:45822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:16.808973 systemd[1]: Started sshd@14-10.0.29.66:22-139.178.68.195:45822.service - OpenSSH per-connection server daemon (139.178.68.195:45822). Dec 16 12:20:16.812194 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 12:20:16.812259 kernel: audit: type=1130 audit(1765887616.808:787): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.29.66:22-139.178.68.195:45822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:17.111269 kubelet[2932]: E1216 12:20:17.111150 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:20:17.636000 audit[5403]: USER_ACCT pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.637800 sshd[5403]: Accepted publickey for core from 139.178.68.195 port 45822 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:17.640988 kernel: audit: type=1101 
audit(1765887617.636:788): pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.641080 kernel: audit: type=1103 audit(1765887617.640:789): pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.640000 audit[5403]: CRED_ACQ pid=5403 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.641857 sshd-session[5403]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:17.645519 kernel: audit: type=1006 audit(1765887617.640:790): pid=5403 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Dec 16 12:20:17.640000 audit[5403]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe73b6ae0 a2=3 a3=0 items=0 ppid=1 pid=5403 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:17.646386 systemd-logind[1646]: New session 16 of user core. 
Dec 16 12:20:17.648769 kernel: audit: type=1300 audit(1765887617.640:790): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe73b6ae0 a2=3 a3=0 items=0 ppid=1 pid=5403 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:17.648858 kernel: audit: type=1327 audit(1765887617.640:790): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:17.640000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:17.659161 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 12:20:17.661000 audit[5403]: USER_START pid=5403 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.662000 audit[5407]: CRED_ACQ pid=5407 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.668421 kernel: audit: type=1105 audit(1765887617.661:791): pid=5403 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:17.668511 kernel: audit: type=1103 audit(1765887617.662:792): pid=5407 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:18.186902 sshd[5407]: Connection closed by 139.178.68.195 port 45822 Dec 16 12:20:18.186931 sshd-session[5403]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:18.190000 audit[5403]: USER_END pid=5403 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:18.194051 systemd[1]: sshd@14-10.0.29.66:22-139.178.68.195:45822.service: Deactivated successfully. Dec 16 12:20:18.190000 audit[5403]: CRED_DISP pid=5403 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:18.197842 kernel: audit: type=1106 audit(1765887618.190:793): pid=5403 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:18.197919 kernel: audit: type=1104 audit(1765887618.190:794): pid=5403 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:18.193000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.29.66:22-139.178.68.195:45822 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:18.198425 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:20:18.199339 systemd-logind[1646]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:20:18.200526 systemd-logind[1646]: Removed session 16. Dec 16 12:20:20.110937 kubelet[2932]: E1216 12:20:20.110851 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:20:22.112023 kubelet[2932]: E1216 12:20:22.111974 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:20:23.110646 kubelet[2932]: E1216 12:20:23.110573 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 
12:20:23.353054 systemd[1]: Started sshd@15-10.0.29.66:22-139.178.68.195:48740.service - OpenSSH per-connection server daemon (139.178.68.195:48740). Dec 16 12:20:23.352000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.29.66:22-139.178.68.195:48740 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:23.354165 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:20:23.354582 kernel: audit: type=1130 audit(1765887623.352:796): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.29.66:22-139.178.68.195:48740 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:24.182284 sshd[5420]: Accepted publickey for core from 139.178.68.195 port 48740 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:24.181000 audit[5420]: USER_ACCT pid=5420 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.185892 kernel: audit: type=1101 audit(1765887624.181:797): pid=5420 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.185000 audit[5420]: CRED_ACQ pid=5420 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.187764 sshd-session[5420]: pam_unix(sshd:session): session opened for user 
core(uid=500) by core(uid=0) Dec 16 12:20:24.190530 kernel: audit: type=1103 audit(1765887624.185:798): pid=5420 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.190608 kernel: audit: type=1006 audit(1765887624.186:799): pid=5420 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Dec 16 12:20:24.190628 kernel: audit: type=1300 audit(1765887624.186:799): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc9c3a1b0 a2=3 a3=0 items=0 ppid=1 pid=5420 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:24.186000 audit[5420]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc9c3a1b0 a2=3 a3=0 items=0 ppid=1 pid=5420 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:24.186000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:24.195209 kernel: audit: type=1327 audit(1765887624.186:799): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:24.197143 systemd-logind[1646]: New session 17 of user core. Dec 16 12:20:24.207098 systemd[1]: Started session-17.scope - Session 17 of User core. 
Dec 16 12:20:24.209000 audit[5420]: USER_START pid=5420 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.211000 audit[5424]: CRED_ACQ pid=5424 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.216714 kernel: audit: type=1105 audit(1765887624.209:800): pid=5420 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.216822 kernel: audit: type=1103 audit(1765887624.211:801): pid=5424 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.759506 sshd[5424]: Connection closed by 139.178.68.195 port 48740 Dec 16 12:20:24.759897 sshd-session[5420]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:24.760000 audit[5420]: USER_END pid=5420 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.760000 audit[5420]: CRED_DISP pid=5420 uid=0 
auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.765835 systemd[1]: sshd@15-10.0.29.66:22-139.178.68.195:48740.service: Deactivated successfully. Dec 16 12:20:24.768088 kernel: audit: type=1106 audit(1765887624.760:802): pid=5420 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.768217 kernel: audit: type=1104 audit(1765887624.760:803): pid=5420 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:24.766000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.29.66:22-139.178.68.195:48740 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:24.768929 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:20:24.769730 systemd-logind[1646]: Session 17 logged out. Waiting for processes to exit. Dec 16 12:20:24.771465 systemd-logind[1646]: Removed session 17. 
Dec 16 12:20:27.111826 kubelet[2932]: E1216 12:20:27.111302 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:20:27.111826 kubelet[2932]: E1216 12:20:27.111416 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:20:29.927000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.29.66:22-139.178.68.195:48748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:29.928549 systemd[1]: Started sshd@16-10.0.29.66:22-139.178.68.195:48748.service - OpenSSH per-connection server daemon (139.178.68.195:48748). 
Dec 16 12:20:29.929607 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:20:29.929658 kernel: audit: type=1130 audit(1765887629.927:805): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.29.66:22-139.178.68.195:48748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:30.754000 audit[5439]: USER_ACCT pid=5439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.755115 sshd[5439]: Accepted publickey for core from 139.178.68.195 port 48748 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:30.758219 sshd-session[5439]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:30.756000 audit[5439]: CRED_ACQ pid=5439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.762830 kernel: audit: type=1101 audit(1765887630.754:806): pid=5439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.762899 kernel: audit: type=1103 audit(1765887630.756:807): pid=5439 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.765132 kernel: audit: type=1006 audit(1765887630.756:808): 
pid=5439 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1 Dec 16 12:20:30.756000 audit[5439]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc2943fe0 a2=3 a3=0 items=0 ppid=1 pid=5439 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:30.768890 systemd-logind[1646]: New session 18 of user core. Dec 16 12:20:30.769617 kernel: audit: type=1300 audit(1765887630.756:808): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc2943fe0 a2=3 a3=0 items=0 ppid=1 pid=5439 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:30.756000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:30.771838 kernel: audit: type=1327 audit(1765887630.756:808): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:30.775311 systemd[1]: Started session-18.scope - Session 18 of User core. 
Dec 16 12:20:30.777000 audit[5439]: USER_START pid=5439 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.779000 audit[5443]: CRED_ACQ pid=5443 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.786126 kernel: audit: type=1105 audit(1765887630.777:809): pid=5439 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:30.786217 kernel: audit: type=1103 audit(1765887630.779:810): pid=5443 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:31.289966 sshd[5443]: Connection closed by 139.178.68.195 port 48748 Dec 16 12:20:31.290335 sshd-session[5439]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:31.291000 audit[5439]: USER_END pid=5439 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:31.294924 systemd[1]: 
sshd@16-10.0.29.66:22-139.178.68.195:48748.service: Deactivated successfully. Dec 16 12:20:31.291000 audit[5439]: CRED_DISP pid=5439 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:31.296953 systemd[1]: session-18.scope: Deactivated successfully. Dec 16 12:20:31.298383 kernel: audit: type=1106 audit(1765887631.291:811): pid=5439 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:31.298458 kernel: audit: type=1104 audit(1765887631.291:812): pid=5439 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:31.294000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.29.66:22-139.178.68.195:48748 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:31.300026 systemd-logind[1646]: Session 18 logged out. Waiting for processes to exit. Dec 16 12:20:31.301281 systemd-logind[1646]: Removed session 18. Dec 16 12:20:31.476724 systemd[1]: Started sshd@17-10.0.29.66:22-139.178.68.195:38322.service - OpenSSH per-connection server daemon (139.178.68.195:38322). Dec 16 12:20:31.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.29.66:22-139.178.68.195:38322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:32.112177 kubelet[2932]: E1216 12:20:32.112034 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:20:32.375000 audit[5457]: USER_ACCT pid=5457 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:32.376508 sshd[5457]: Accepted publickey for core from 139.178.68.195 port 38322 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:32.377000 audit[5457]: CRED_ACQ pid=5457 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:32.377000 audit[5457]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe31e76c0 a2=3 a3=0 items=0 ppid=1 pid=5457 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:32.377000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:32.378676 sshd-session[5457]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:32.385317 systemd-logind[1646]: New session 19 of user core. Dec 16 12:20:32.392049 systemd[1]: Started session-19.scope - Session 19 of User core. Dec 16 12:20:32.393000 audit[5457]: USER_START pid=5457 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:32.395000 audit[5463]: CRED_ACQ pid=5463 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:33.000616 sshd[5463]: Connection closed by 139.178.68.195 port 38322 Dec 16 12:20:33.001020 sshd-session[5457]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:33.002000 audit[5457]: USER_END pid=5457 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:33.002000 audit[5457]: CRED_DISP pid=5457 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:33.006043 systemd[1]: 
sshd@17-10.0.29.66:22-139.178.68.195:38322.service: Deactivated successfully. Dec 16 12:20:33.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.29.66:22-139.178.68.195:38322 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:33.008482 systemd[1]: session-19.scope: Deactivated successfully. Dec 16 12:20:33.010473 systemd-logind[1646]: Session 19 logged out. Waiting for processes to exit. Dec 16 12:20:33.011322 systemd-logind[1646]: Removed session 19. Dec 16 12:20:33.170722 systemd[1]: Started sshd@18-10.0.29.66:22-139.178.68.195:38328.service - OpenSSH per-connection server daemon (139.178.68.195:38328). Dec 16 12:20:33.170000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.29.66:22-139.178.68.195:38328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:33.988000 audit[5474]: USER_ACCT pid=5474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:33.989593 sshd[5474]: Accepted publickey for core from 139.178.68.195 port 38328 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:33.990000 audit[5474]: CRED_ACQ pid=5474 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:33.990000 audit[5474]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd4713560 a2=3 a3=0 items=0 ppid=1 pid=5474 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:33.990000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:33.991954 sshd-session[5474]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:33.997917 systemd-logind[1646]: New session 20 of user core. Dec 16 12:20:34.002035 systemd[1]: Started session-20.scope - Session 20 of User core. Dec 16 12:20:34.003000 audit[5474]: USER_START pid=5474 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:34.005000 audit[5478]: CRED_ACQ pid=5478 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:34.110026 kubelet[2932]: E1216 12:20:34.109963 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:20:34.894000 audit[5493]: NETFILTER_CFG table=filter:140 family=2 entries=26 op=nft_register_rule pid=5493 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:34.894000 audit[5493]: SYSCALL arch=c00000b7 syscall=211 success=yes 
exit=14176 a0=3 a1=ffffff6613d0 a2=0 a3=1 items=0 ppid=3084 pid=5493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:34.894000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:34.904000 audit[5493]: NETFILTER_CFG table=nat:141 family=2 entries=20 op=nft_register_rule pid=5493 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:34.904000 audit[5493]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffff6613d0 a2=0 a3=1 items=0 ppid=3084 pid=5493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:34.904000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:35.066713 sshd[5478]: Connection closed by 139.178.68.195 port 38328 Dec 16 12:20:35.065274 sshd-session[5474]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:35.068000 audit[5474]: USER_END pid=5474 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:35.073892 kernel: kauditd_printk_skb: 26 callbacks suppressed Dec 16 12:20:35.074023 kernel: audit: type=1106 audit(1765887635.068:831): pid=5474 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail 
acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:35.075204 systemd[1]: sshd@18-10.0.29.66:22-139.178.68.195:38328.service: Deactivated successfully. Dec 16 12:20:35.078272 systemd[1]: session-20.scope: Deactivated successfully. Dec 16 12:20:35.068000 audit[5474]: CRED_DISP pid=5474 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:35.082928 kernel: audit: type=1104 audit(1765887635.068:832): pid=5474 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:35.085874 systemd-logind[1646]: Session 20 logged out. Waiting for processes to exit. Dec 16 12:20:35.074000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.29.66:22-139.178.68.195:38328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:35.089215 kernel: audit: type=1131 audit(1765887635.074:833): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.29.66:22-139.178.68.195:38328 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:35.092431 systemd-logind[1646]: Removed session 20. Dec 16 12:20:35.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.29.66:22-139.178.68.195:38342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:35.229827 systemd[1]: Started sshd@19-10.0.29.66:22-139.178.68.195:38342.service - OpenSSH per-connection server daemon (139.178.68.195:38342). Dec 16 12:20:35.233849 kernel: audit: type=1130 audit(1765887635.229:834): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.29.66:22-139.178.68.195:38342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:35.921000 audit[5502]: NETFILTER_CFG table=filter:142 family=2 entries=38 op=nft_register_rule pid=5502 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:35.921000 audit[5502]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffe617bcb0 a2=0 a3=1 items=0 ppid=3084 pid=5502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:35.927792 kernel: audit: type=1325 audit(1765887635.921:835): table=filter:142 family=2 entries=38 op=nft_register_rule pid=5502 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:35.927877 kernel: audit: type=1300 audit(1765887635.921:835): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffe617bcb0 a2=0 a3=1 items=0 ppid=3084 pid=5502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:35.927899 kernel: audit: type=1327 audit(1765887635.921:835): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:35.921000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:35.940000 audit[5502]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5502 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:35.940000 audit[5502]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe617bcb0 a2=0 a3=1 items=0 ppid=3084 pid=5502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:35.945921 kernel: audit: type=1325 audit(1765887635.940:836): table=nat:143 family=2 entries=20 op=nft_register_rule pid=5502 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:35.946044 kernel: audit: type=1300 audit(1765887635.940:836): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe617bcb0 a2=0 a3=1 items=0 ppid=3084 pid=5502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:35.946092 kernel: audit: type=1327 audit(1765887635.940:836): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:35.940000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:36.057000 audit[5498]: USER_ACCT pid=5498 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:36.058720 sshd[5498]: Accepted publickey for core from 139.178.68.195 port 38342 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:36.058000 audit[5498]: CRED_ACQ pid=5498 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:36.058000 audit[5498]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd07f8cc0 a2=3 a3=0 items=0 ppid=1 pid=5498 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:36.058000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:36.060498 sshd-session[5498]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:36.065187 systemd-logind[1646]: New session 21 of user core. Dec 16 12:20:36.070062 systemd[1]: Started session-21.scope - Session 21 of User core. Dec 16 12:20:36.071000 audit[5498]: USER_START pid=5498 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:36.073000 audit[5504]: CRED_ACQ pid=5504 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:36.705971 sshd[5504]: Connection closed by 139.178.68.195 port 38342 Dec 16 12:20:36.707996 sshd-session[5498]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:36.709000 audit[5498]: USER_END pid=5498 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' 
Dec 16 12:20:36.709000 audit[5498]: CRED_DISP pid=5498 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:36.713500 systemd[1]: sshd@19-10.0.29.66:22-139.178.68.195:38342.service: Deactivated successfully. Dec 16 12:20:36.715000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.29.66:22-139.178.68.195:38342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:36.718490 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 12:20:36.721075 systemd-logind[1646]: Session 21 logged out. Waiting for processes to exit. Dec 16 12:20:36.723339 systemd-logind[1646]: Removed session 21. Dec 16 12:20:36.872394 systemd[1]: Started sshd@20-10.0.29.66:22-139.178.68.195:38350.service - OpenSSH per-connection server daemon (139.178.68.195:38350). Dec 16 12:20:36.871000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.29.66:22-139.178.68.195:38350 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:20:37.113400 kubelet[2932]: E1216 12:20:37.113272 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:20:37.695000 audit[5516]: USER_ACCT pid=5516 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:37.696374 sshd[5516]: Accepted publickey for core from 139.178.68.195 port 38350 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:37.696000 audit[5516]: CRED_ACQ pid=5516 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:37.696000 audit[5516]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff92e5db0 a2=3 a3=0 items=0 ppid=1 pid=5516 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:37.696000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:37.698133 sshd-session[5516]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:37.702642 systemd-logind[1646]: New session 22 of user core. 
Dec 16 12:20:37.709103 systemd[1]: Started session-22.scope - Session 22 of User core. Dec 16 12:20:37.711000 audit[5516]: USER_START pid=5516 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:37.712000 audit[5520]: CRED_ACQ pid=5520 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:38.110901 kubelet[2932]: E1216 12:20:38.110741 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:20:38.112543 kubelet[2932]: E1216 12:20:38.112017 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:20:38.228663 sshd[5520]: Connection closed by 139.178.68.195 port 38350 Dec 16 12:20:38.228001 sshd-session[5516]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:38.228000 audit[5516]: USER_END pid=5516 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:38.229000 audit[5516]: CRED_DISP pid=5516 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:38.232339 systemd[1]: sshd@20-10.0.29.66:22-139.178.68.195:38350.service: Deactivated successfully. Dec 16 12:20:38.231000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.29.66:22-139.178.68.195:38350 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:38.234305 systemd[1]: session-22.scope: Deactivated successfully. Dec 16 12:20:38.236244 systemd-logind[1646]: Session 22 logged out. Waiting for processes to exit. Dec 16 12:20:38.237492 systemd-logind[1646]: Removed session 22. 
Dec 16 12:20:38.632000 audit[5535]: NETFILTER_CFG table=filter:144 family=2 entries=26 op=nft_register_rule pid=5535 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:38.632000 audit[5535]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffdaf101a0 a2=0 a3=1 items=0 ppid=3084 pid=5535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:38.632000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:38.640000 audit[5535]: NETFILTER_CFG table=nat:145 family=2 entries=104 op=nft_register_chain pid=5535 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:20:38.640000 audit[5535]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffdaf101a0 a2=0 a3=1 items=0 ppid=3084 pid=5535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:38.640000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:20:41.110937 kubelet[2932]: E1216 12:20:41.110874 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:20:43.393000 audit[1]: SERVICE_START pid=1 uid=0 
auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.29.66:22-139.178.68.195:46162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:43.394160 systemd[1]: Started sshd@21-10.0.29.66:22-139.178.68.195:46162.service - OpenSSH per-connection server daemon (139.178.68.195:46162). Dec 16 12:20:43.394942 kernel: kauditd_printk_skb: 27 callbacks suppressed Dec 16 12:20:43.395045 kernel: audit: type=1130 audit(1765887643.393:856): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.29.66:22-139.178.68.195:46162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:44.214000 audit[5563]: USER_ACCT pid=5563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.215946 sshd[5563]: Accepted publickey for core from 139.178.68.195 port 46162 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:44.218000 audit[5563]: CRED_ACQ pid=5563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.220408 sshd-session[5563]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:44.222397 kernel: audit: type=1101 audit(1765887644.214:857): pid=5563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.222482 kernel: 
audit: type=1103 audit(1765887644.218:858): pid=5563 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.224262 kernel: audit: type=1006 audit(1765887644.218:859): pid=5563 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Dec 16 12:20:44.218000 audit[5563]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf3d8b50 a2=3 a3=0 items=0 ppid=1 pid=5563 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:44.227492 kernel: audit: type=1300 audit(1765887644.218:859): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf3d8b50 a2=3 a3=0 items=0 ppid=1 pid=5563 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:44.227765 kernel: audit: type=1327 audit(1765887644.218:859): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:44.218000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:44.229169 systemd-logind[1646]: New session 23 of user core. Dec 16 12:20:44.237047 systemd[1]: Started session-23.scope - Session 23 of User core. 
Dec 16 12:20:44.238000 audit[5563]: USER_START pid=5563 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.242000 audit[5567]: CRED_ACQ pid=5567 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.246490 kernel: audit: type=1105 audit(1765887644.238:860): pid=5563 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.246676 kernel: audit: type=1103 audit(1765887644.242:861): pid=5567 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.748104 sshd[5567]: Connection closed by 139.178.68.195 port 46162 Dec 16 12:20:44.748686 sshd-session[5563]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:44.749000 audit[5563]: USER_END pid=5563 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.749000 audit[5563]: CRED_DISP pid=5563 uid=0 
auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.756053 systemd[1]: sshd@21-10.0.29.66:22-139.178.68.195:46162.service: Deactivated successfully. Dec 16 12:20:44.758010 kernel: audit: type=1106 audit(1765887644.749:862): pid=5563 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.758180 kernel: audit: type=1104 audit(1765887644.749:863): pid=5563 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:44.754000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.29.66:22-139.178.68.195:46162 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:44.758613 systemd[1]: session-23.scope: Deactivated successfully. Dec 16 12:20:44.761028 systemd-logind[1646]: Session 23 logged out. Waiting for processes to exit. Dec 16 12:20:44.762710 systemd-logind[1646]: Removed session 23. 
Dec 16 12:20:45.110906 kubelet[2932]: E1216 12:20:45.110762 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:20:49.111148 kubelet[2932]: E1216 12:20:49.111084 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:20:49.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.29.66:22-139.178.68.195:46164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:49.917651 systemd[1]: Started sshd@22-10.0.29.66:22-139.178.68.195:46164.service - OpenSSH per-connection server daemon (139.178.68.195:46164). 
Dec 16 12:20:49.918387 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:20:49.918434 kernel: audit: type=1130 audit(1765887649.916:865): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.29.66:22-139.178.68.195:46164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:50.738000 audit[5580]: USER_ACCT pid=5580 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.739740 sshd[5580]: Accepted publickey for core from 139.178.68.195 port 46164 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:50.742222 sshd-session[5580]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:50.738000 audit[5580]: CRED_ACQ pid=5580 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.745709 kernel: audit: type=1101 audit(1765887650.738:866): pid=5580 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.745775 kernel: audit: type=1103 audit(1765887650.738:867): pid=5580 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.745797 kernel: audit: type=1006 audit(1765887650.738:868): 
pid=5580 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Dec 16 12:20:50.738000 audit[5580]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe5e60830 a2=3 a3=0 items=0 ppid=1 pid=5580 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:50.747545 systemd-logind[1646]: New session 24 of user core. Dec 16 12:20:50.750646 kernel: audit: type=1300 audit(1765887650.738:868): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe5e60830 a2=3 a3=0 items=0 ppid=1 pid=5580 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:50.750728 kernel: audit: type=1327 audit(1765887650.738:868): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:50.738000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:50.759059 systemd[1]: Started session-24.scope - Session 24 of User core. 
Dec 16 12:20:50.762000 audit[5580]: USER_START pid=5580 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.764000 audit[5584]: CRED_ACQ pid=5584 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.770023 kernel: audit: type=1105 audit(1765887650.762:869): pid=5580 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:50.770155 kernel: audit: type=1103 audit(1765887650.764:870): pid=5584 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:51.112066 kubelet[2932]: E1216 12:20:51.111949 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:20:51.277415 sshd[5584]: Connection closed by 139.178.68.195 port 46164 Dec 16 12:20:51.277330 sshd-session[5580]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:51.277000 audit[5580]: USER_END pid=5580 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:51.277000 audit[5580]: CRED_DISP pid=5580 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:51.282527 systemd[1]: sshd@22-10.0.29.66:22-139.178.68.195:46164.service: Deactivated successfully. 
Dec 16 12:20:51.285135 kernel: audit: type=1106 audit(1765887651.277:871): pid=5580 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:51.285222 kernel: audit: type=1104 audit(1765887651.277:872): pid=5580 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:51.281000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.29.66:22-139.178.68.195:46164 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:51.285440 systemd[1]: session-24.scope: Deactivated successfully. Dec 16 12:20:51.287270 systemd-logind[1646]: Session 24 logged out. Waiting for processes to exit. Dec 16 12:20:51.288409 systemd-logind[1646]: Removed session 24. 
Dec 16 12:20:52.111084 kubelet[2932]: E1216 12:20:52.110692 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:20:53.110768 kubelet[2932]: E1216 12:20:53.110423 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:20:53.111926 kubelet[2932]: E1216 12:20:53.111125 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:20:56.448000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.29.66:22-139.178.68.195:51224 comm="systemd" exe="/usr/lib/systemd/systemd" 
hostname=? addr=? terminal=? res=success' Dec 16 12:20:56.449470 systemd[1]: Started sshd@23-10.0.29.66:22-139.178.68.195:51224.service - OpenSSH per-connection server daemon (139.178.68.195:51224). Dec 16 12:20:56.450524 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:20:56.450572 kernel: audit: type=1130 audit(1765887656.448:874): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.29.66:22-139.178.68.195:51224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:57.285000 audit[5604]: USER_ACCT pid=5604 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.286642 sshd[5604]: Accepted publickey for core from 139.178.68.195 port 51224 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:20:57.289847 kernel: audit: type=1101 audit(1765887657.285:875): pid=5604 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.289000 audit[5604]: CRED_ACQ pid=5604 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.291268 sshd-session[5604]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:20:57.293928 kernel: audit: type=1103 audit(1765887657.289:876): pid=5604 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred 
grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.294017 kernel: audit: type=1006 audit(1765887657.289:877): pid=5604 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1 Dec 16 12:20:57.289000 audit[5604]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff97bed90 a2=3 a3=0 items=0 ppid=1 pid=5604 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:57.299125 kernel: audit: type=1300 audit(1765887657.289:877): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff97bed90 a2=3 a3=0 items=0 ppid=1 pid=5604 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:20:57.289000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:57.301107 kernel: audit: type=1327 audit(1765887657.289:877): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:20:57.303452 systemd-logind[1646]: New session 25 of user core. Dec 16 12:20:57.310055 systemd[1]: Started session-25.scope - Session 25 of User core. 
Dec 16 12:20:57.313000 audit[5604]: USER_START pid=5604 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.317000 audit[5608]: CRED_ACQ pid=5608 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.321113 kernel: audit: type=1105 audit(1765887657.313:878): pid=5604 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.321218 kernel: audit: type=1103 audit(1765887657.317:879): pid=5608 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.837635 sshd[5608]: Connection closed by 139.178.68.195 port 51224 Dec 16 12:20:57.837398 sshd-session[5604]: pam_unix(sshd:session): session closed for user core Dec 16 12:20:57.839000 audit[5604]: USER_END pid=5604 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.844667 systemd[1]: 
sshd@23-10.0.29.66:22-139.178.68.195:51224.service: Deactivated successfully. Dec 16 12:20:57.840000 audit[5604]: CRED_DISP pid=5604 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.847459 kernel: audit: type=1106 audit(1765887657.839:880): pid=5604 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.847529 kernel: audit: type=1104 audit(1765887657.840:881): pid=5604 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:20:57.846000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.29.66:22-139.178.68.195:51224 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:20:57.849397 systemd[1]: session-25.scope: Deactivated successfully. Dec 16 12:20:57.850271 systemd-logind[1646]: Session 25 logged out. Waiting for processes to exit. Dec 16 12:20:57.852393 systemd-logind[1646]: Removed session 25. 
Dec 16 12:20:59.110417 kubelet[2932]: E1216 12:20:59.110359 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:21:00.110567 kubelet[2932]: E1216 12:21:00.110490 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:21:03.031408 systemd[1]: Started sshd@24-10.0.29.66:22-139.178.68.195:38424.service - OpenSSH per-connection server daemon (139.178.68.195:38424). Dec 16 12:21:03.030000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.29.66:22-139.178.68.195:38424 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:21:03.032615 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:21:03.032718 kernel: audit: type=1130 audit(1765887663.030:883): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.29.66:22-139.178.68.195:38424 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:03.111027 kubelet[2932]: E1216 12:21:03.110963 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:21:03.949000 audit[5623]: USER_ACCT pid=5623 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.950476 sshd[5623]: Accepted publickey for core from 139.178.68.195 port 38424 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:21:03.953863 kernel: audit: type=1101 audit(1765887663.949:884): pid=5623 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.954000 audit[5623]: CRED_ACQ pid=5623 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.959123 sshd-session[5623]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:21:03.961224 kernel: audit: type=1103 audit(1765887663.954:885): pid=5623 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.961376 kernel: audit: type=1006 audit(1765887663.957:886): pid=5623 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=26 res=1 Dec 16 12:21:03.961400 kernel: audit: type=1300 audit(1765887663.957:886): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe7c4c730 a2=3 a3=0 items=0 ppid=1 pid=5623 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:03.957000 audit[5623]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe7c4c730 a2=3 a3=0 items=0 ppid=1 pid=5623 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=26 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:03.957000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:03.966562 kernel: audit: type=1327 audit(1765887663.957:886): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:21:03.970886 systemd-logind[1646]: New session 26 of user core. Dec 16 12:21:03.982248 systemd[1]: Started session-26.scope - Session 26 of User core. 
Dec 16 12:21:03.985000 audit[5623]: USER_START pid=5623 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.990857 kernel: audit: type=1105 audit(1765887663.985:887): pid=5623 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.990000 audit[5627]: CRED_ACQ pid=5627 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:03.994827 kernel: audit: type=1103 audit(1765887663.990:888): pid=5627 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:04.110956 kubelet[2932]: E1216 12:21:04.110455 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 
16 12:21:04.551337 sshd[5627]: Connection closed by 139.178.68.195 port 38424 Dec 16 12:21:04.551650 sshd-session[5623]: pam_unix(sshd:session): session closed for user core Dec 16 12:21:04.552000 audit[5623]: USER_END pid=5623 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:04.555294 systemd[1]: sshd@24-10.0.29.66:22-139.178.68.195:38424.service: Deactivated successfully. Dec 16 12:21:04.557100 systemd[1]: session-26.scope: Deactivated successfully. Dec 16 12:21:04.552000 audit[5623]: CRED_DISP pid=5623 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:04.559015 systemd-logind[1646]: Session 26 logged out. Waiting for processes to exit. Dec 16 12:21:04.559997 systemd-logind[1646]: Removed session 26. 
Dec 16 12:21:04.560542 kernel: audit: type=1106 audit(1765887664.552:889): pid=5623 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:04.560667 kernel: audit: type=1104 audit(1765887664.552:890): pid=5623 uid=0 auid=500 ses=26 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:21:04.554000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@24-10.0.29.66:22-139.178.68.195:38424 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:21:05.111013 kubelet[2932]: E1216 12:21:05.110918 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:21:07.111409 containerd[1662]: 
time="2025-12-16T12:21:07.111366292Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:21:07.460715 containerd[1662]: time="2025-12-16T12:21:07.460665211Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:07.462853 containerd[1662]: time="2025-12-16T12:21:07.462803138Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:21:07.463025 containerd[1662]: time="2025-12-16T12:21:07.462833898Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:07.463171 kubelet[2932]: E1216 12:21:07.463125 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:07.464089 kubelet[2932]: E1216 12:21:07.463173 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:07.464089 kubelet[2932]: E1216 12:21:07.463300 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-8zlvg_calico-apiserver(963a1289-7bdf-4dde-a21a-b1e5da9ecfcc): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:21:07.464089 kubelet[2932]: E1216 12:21:07.463368 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:21:11.111605 containerd[1662]: time="2025-12-16T12:21:11.111561625Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:21:11.437688 containerd[1662]: time="2025-12-16T12:21:11.437619949Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:11.439347 containerd[1662]: time="2025-12-16T12:21:11.439256874Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:21:11.439412 containerd[1662]: time="2025-12-16T12:21:11.439343234Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:11.439611 kubelet[2932]: E1216 12:21:11.439571 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:21:11.440301 kubelet[2932]: E1216 12:21:11.439621 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: 
not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:21:11.440301 kubelet[2932]: E1216 12:21:11.439694 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:11.440547 containerd[1662]: time="2025-12-16T12:21:11.440511158Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:21:11.683863 update_engine[1649]: I20251216 12:21:11.683508 1649 prefs.cc:52] certificate-report-to-send-update not present in /var/lib/update_engine/prefs Dec 16 12:21:11.683863 update_engine[1649]: I20251216 12:21:11.683557 1649 prefs.cc:52] certificate-report-to-send-download not present in /var/lib/update_engine/prefs Dec 16 12:21:11.683863 update_engine[1649]: I20251216 12:21:11.683784 1649 prefs.cc:52] aleph-version not present in /var/lib/update_engine/prefs Dec 16 12:21:11.684235 update_engine[1649]: I20251216 12:21:11.684146 1649 omaha_request_params.cc:62] Current group set to alpha Dec 16 12:21:11.684267 update_engine[1649]: I20251216 12:21:11.684238 1649 update_attempter.cc:499] Already updated boot flags. Skipping. Dec 16 12:21:11.684267 update_engine[1649]: I20251216 12:21:11.684247 1649 update_attempter.cc:643] Scheduling an action processor start. 
Dec 16 12:21:11.684307 update_engine[1649]: I20251216 12:21:11.684264 1649 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction Dec 16 12:21:11.684510 update_engine[1649]: I20251216 12:21:11.684473 1649 prefs.cc:52] previous-version not present in /var/lib/update_engine/prefs Dec 16 12:21:11.684547 update_engine[1649]: I20251216 12:21:11.684530 1649 omaha_request_action.cc:271] Posting an Omaha request to disabled Dec 16 12:21:11.684547 update_engine[1649]: I20251216 12:21:11.684538 1649 omaha_request_action.cc:272] Request: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: Dec 16 12:21:11.684547 update_engine[1649]: I20251216 12:21:11.684543 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:21:11.684770 locksmithd[1697]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_CHECKING_FOR_UPDATE" NewVersion=0.0.0 NewSize=0 Dec 16 12:21:11.686351 update_engine[1649]: I20251216 12:21:11.686313 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:21:11.687057 update_engine[1649]: I20251216 12:21:11.687008 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 12:21:11.694480 update_engine[1649]: E20251216 12:21:11.694365 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:21:11.694480 update_engine[1649]: I20251216 12:21:11.694447 1649 libcurl_http_fetcher.cc:283] No HTTP response, retry 1 Dec 16 12:21:11.780435 containerd[1662]: time="2025-12-16T12:21:11.780265966Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:11.782090 containerd[1662]: time="2025-12-16T12:21:11.781989052Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:21:11.782090 containerd[1662]: time="2025-12-16T12:21:11.782025132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:11.782270 kubelet[2932]: E1216 12:21:11.782231 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:21:11.782315 kubelet[2932]: E1216 12:21:11.782282 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:21:11.782398 kubelet[2932]: E1216 12:21:11.782381 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod 
whisker-74bfdd549c-rv9w7_calico-system(29a341e5-f159-490a-9c2d-ec19b832725b): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:11.782654 kubelet[2932]: E1216 12:21:11.782627 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" Dec 16 12:21:12.111769 kubelet[2932]: E1216 12:21:12.110799 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:21:14.112016 containerd[1662]: time="2025-12-16T12:21:14.111975755Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:21:14.450501 containerd[1662]: time="2025-12-16T12:21:14.450407239Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:14.451877 containerd[1662]: 
time="2025-12-16T12:21:14.451835723Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:21:14.451877 containerd[1662]: time="2025-12-16T12:21:14.451904324Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:14.452086 kubelet[2932]: E1216 12:21:14.452047 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:21:14.452370 kubelet[2932]: E1216 12:21:14.452096 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:21:14.452715 kubelet[2932]: E1216 12:21:14.452189 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-xvgpm_calico-system(f42d8c07-697c-492a-826b-630e49b3282d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:14.452715 kubelet[2932]: E1216 12:21:14.452637 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:21:19.111175 kubelet[2932]: E1216 12:21:19.111118 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:21:19.111589 containerd[1662]: time="2025-12-16T12:21:19.111444768Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:21:19.456840 containerd[1662]: time="2025-12-16T12:21:19.456775034Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:19.458079 containerd[1662]: time="2025-12-16T12:21:19.457992438Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:21:19.458155 containerd[1662]: time="2025-12-16T12:21:19.458099118Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:19.458321 kubelet[2932]: E1216 12:21:19.458265 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 
12:21:19.458321 kubelet[2932]: E1216 12:21:19.458315 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:21:19.458540 kubelet[2932]: E1216 12:21:19.458499 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-7db869f9-8vr6z_calico-system(01d79110-1504-420e-beb4-8a0f58f3b458): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:19.458587 kubelet[2932]: E1216 12:21:19.458552 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:21:19.458770 containerd[1662]: time="2025-12-16T12:21:19.458744360Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:21:19.789152 containerd[1662]: time="2025-12-16T12:21:19.789024058Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:19.791459 containerd[1662]: time="2025-12-16T12:21:19.791399586Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:21:19.791548 containerd[1662]: time="2025-12-16T12:21:19.791487026Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:19.791703 kubelet[2932]: E1216 12:21:19.791666 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:21:19.791804 kubelet[2932]: E1216 12:21:19.791788 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:21:19.791964 kubelet[2932]: E1216 12:21:19.791945 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:19.792886 containerd[1662]: time="2025-12-16T12:21:19.792862190Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:21:20.130689 containerd[1662]: time="2025-12-16T12:21:20.130633752Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:20.132208 containerd[1662]: time="2025-12-16T12:21:20.132162757Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:21:20.132295 containerd[1662]: time="2025-12-16T12:21:20.132230077Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:20.132435 kubelet[2932]: E1216 12:21:20.132397 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:21:20.132676 kubelet[2932]: E1216 12:21:20.132447 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:21:20.132676 kubelet[2932]: E1216 12:21:20.132519 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-tp52x_calico-system(1d6077b5-49a5-4d21-bd6b-0ffae41c4da0): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:20.132676 kubelet[2932]: E1216 12:21:20.132557 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0" Dec 16 12:21:21.685519 update_engine[1649]: I20251216 12:21:21.685441 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:21:21.685889 update_engine[1649]: I20251216 12:21:21.685539 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:21:21.685921 update_engine[1649]: I20251216 12:21:21.685894 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. Dec 16 12:21:21.692560 update_engine[1649]: E20251216 12:21:21.692482 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found) Dec 16 12:21:21.692723 update_engine[1649]: I20251216 12:21:21.692584 1649 libcurl_http_fetcher.cc:283] No HTTP response, retry 2 Dec 16 12:21:23.111304 kubelet[2932]: E1216 12:21:23.111242 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b" 
Dec 16 12:21:24.110692 containerd[1662]: time="2025-12-16T12:21:24.110587860Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:21:24.441632 containerd[1662]: time="2025-12-16T12:21:24.441516960Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:21:24.443197 containerd[1662]: time="2025-12-16T12:21:24.443153725Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:21:24.443287 containerd[1662]: time="2025-12-16T12:21:24.443232685Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:21:24.443428 kubelet[2932]: E1216 12:21:24.443390 2932 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:24.443682 kubelet[2932]: E1216 12:21:24.443440 2932 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:21:24.443682 kubelet[2932]: E1216 12:21:24.443514 2932 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-5d558c8677-khwzf_calico-apiserver(c36f8480-80e5-4b1d-b264-0fc54133bb26): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:21:24.443682 kubelet[2932]: E1216 12:21:24.443544 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26" Dec 16 12:21:27.110326 kubelet[2932]: E1216 12:21:27.110283 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d" Dec 16 12:21:29.352386 systemd[1]: cri-containerd-abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec.scope: Deactivated successfully. Dec 16 12:21:29.354702 systemd[1]: cri-containerd-abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec.scope: Consumed 4.299s CPU time, 67.3M memory peak. 
Dec 16 12:21:29.355089 containerd[1662]: time="2025-12-16T12:21:29.354637776Z" level=info msg="received container exit event container_id:\"abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec\" id:\"abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec\" pid:2758 exit_status:1 exited_at:{seconds:1765887689 nanos:354388655}" Dec 16 12:21:29.357207 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:21:29.357286 kernel: audit: type=1334 audit(1765887689.354:892): prog-id=256 op=LOAD Dec 16 12:21:29.354000 audit: BPF prog-id=256 op=LOAD Dec 16 12:21:29.354000 audit: BPF prog-id=83 op=UNLOAD Dec 16 12:21:29.358655 kernel: audit: type=1334 audit(1765887689.354:893): prog-id=83 op=UNLOAD Dec 16 12:21:29.356000 audit: BPF prog-id=98 op=UNLOAD Dec 16 12:21:29.359626 kernel: audit: type=1334 audit(1765887689.356:894): prog-id=98 op=UNLOAD Dec 16 12:21:29.356000 audit: BPF prog-id=102 op=UNLOAD Dec 16 12:21:29.360585 kernel: audit: type=1334 audit(1765887689.356:895): prog-id=102 op=UNLOAD Dec 16 12:21:29.377580 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec-rootfs.mount: Deactivated successfully. 
Dec 16 12:21:29.707316 kubelet[2932]: I1216 12:21:29.707265 2932 scope.go:117] "RemoveContainer" containerID="abcce0cf88f246b5c4a93f450c503fb28c508b775f7954abf3f7c6748cfcb1ec" Dec 16 12:21:29.709416 containerd[1662]: time="2025-12-16T12:21:29.709376432Z" level=info msg="CreateContainer within sandbox \"667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 12:21:29.719022 containerd[1662]: time="2025-12-16T12:21:29.718971663Z" level=info msg="Container a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:21:29.726561 containerd[1662]: time="2025-12-16T12:21:29.726517887Z" level=info msg="CreateContainer within sandbox \"667f874a3429dacbfb72c1142c1aa79835cb9591b9ff2992196f6b65e04c7ab0\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4\"" Dec 16 12:21:29.727099 containerd[1662]: time="2025-12-16T12:21:29.727059449Z" level=info msg="StartContainer for \"a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4\"" Dec 16 12:21:29.728322 containerd[1662]: time="2025-12-16T12:21:29.728295453Z" level=info msg="connecting to shim a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4" address="unix:///run/containerd/s/c8e23ec280ca2b97526a4348fe9aab16703804e0b2f89a273a970491b143fb6a" protocol=ttrpc version=3 Dec 16 12:21:29.752076 systemd[1]: Started cri-containerd-a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4.scope - libcontainer container a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4. 
Dec 16 12:21:29.763000 audit: BPF prog-id=257 op=LOAD Dec 16 12:21:29.764000 audit: BPF prog-id=258 op=LOAD Dec 16 12:21:29.766978 kernel: audit: type=1334 audit(1765887689.763:896): prog-id=257 op=LOAD Dec 16 12:21:29.767045 kernel: audit: type=1334 audit(1765887689.764:897): prog-id=258 op=LOAD Dec 16 12:21:29.767075 kernel: audit: type=1300 audit(1765887689.764:897): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.764000 audit[5699]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.773719 kernel: audit: type=1327 audit(1765887689.764:897): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.764000 audit: BPF prog-id=258 op=UNLOAD Dec 16 12:21:29.775052 kernel: audit: type=1334 audit(1765887689.764:898): prog-id=258 op=UNLOAD Dec 16 12:21:29.764000 audit[5699]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.778081 kernel: audit: type=1300 audit(1765887689.764:898): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.764000 audit: BPF prog-id=259 op=LOAD Dec 16 12:21:29.764000 audit[5699]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.764000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.765000 audit: BPF prog-id=260 op=LOAD Dec 16 12:21:29.765000 audit[5699]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.765000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.765000 audit: BPF prog-id=260 op=UNLOAD Dec 16 12:21:29.765000 audit[5699]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.765000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.765000 audit: BPF prog-id=259 op=UNLOAD Dec 16 12:21:29.765000 audit[5699]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:21:29.765000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.765000 audit: BPF prog-id=261 op=LOAD Dec 16 12:21:29.765000 audit[5699]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2605 pid=5699 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:21:29.765000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6134616435626163386165376235316664323234613863376663313530 Dec 16 12:21:29.781921 kubelet[2932]: E1216 12:21:29.781851 2932 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.29.66:49616->10.0.29.122:2379: read: connection timed out" Dec 16 12:21:29.800359 containerd[1662]: time="2025-12-16T12:21:29.800321764Z" level=info msg="StartContainer for \"a4ad5bac8ae7b51fd224a8c7fc1507e28ccd854b4fb9a9c366f197a6f6cbacc4\" returns successfully" Dec 16 12:21:30.110103 kubelet[2932]: E1216 12:21:30.109975 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458" Dec 16 12:21:31.104707 systemd[1]: cri-containerd-1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4.scope: Deactivated successfully. Dec 16 12:21:31.105150 systemd[1]: cri-containerd-1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4.scope: Consumed 38.664s CPU time, 122.3M memory peak. 
Dec 16 12:21:31.108000 audit: BPF prog-id=146 op=UNLOAD Dec 16 12:21:31.108000 audit: BPF prog-id=150 op=UNLOAD Dec 16 12:21:31.110171 containerd[1662]: time="2025-12-16T12:21:31.107951912Z" level=info msg="received container exit event container_id:\"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\" id:\"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\" pid:3263 exit_status:1 exited_at:{seconds:1765887691 nanos:107637951}" Dec 16 12:21:31.112418 kubelet[2932]: E1216 12:21:31.112313 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc" Dec 16 12:21:31.136144 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4-rootfs.mount: Deactivated successfully. Dec 16 12:21:31.678581 update_engine[1649]: I20251216 12:21:31.678483 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer Dec 16 12:21:31.678581 update_engine[1649]: I20251216 12:21:31.678591 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP Dec 16 12:21:31.679120 update_engine[1649]: I20251216 12:21:31.679070 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds. 
Dec 16 12:21:31.685132 update_engine[1649]: E20251216 12:21:31.685068 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Dec 16 12:21:31.685264 update_engine[1649]: I20251216 12:21:31.685161 1649 libcurl_http_fetcher.cc:283] No HTTP response, retry 3
Dec 16 12:21:31.715684 kubelet[2932]: I1216 12:21:31.715480 2932 scope.go:117] "RemoveContainer" containerID="1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4"
Dec 16 12:21:31.717113 containerd[1662]: time="2025-12-16T12:21:31.717076583Z" level=info msg="CreateContainer within sandbox \"5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Dec 16 12:21:31.729334 containerd[1662]: time="2025-12-16T12:21:31.728854741Z" level=info msg="Container de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:21:31.735403 containerd[1662]: time="2025-12-16T12:21:31.735348522Z" level=info msg="CreateContainer within sandbox \"5640012fb8de3586741da1326b5f872a7599db377180951bb739df9a954974c5\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae\""
Dec 16 12:21:31.735879 containerd[1662]: time="2025-12-16T12:21:31.735852803Z" level=info msg="StartContainer for \"de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae\""
Dec 16 12:21:31.736770 containerd[1662]: time="2025-12-16T12:21:31.736710206Z" level=info msg="connecting to shim de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae" address="unix:///run/containerd/s/69ff2fc689f8a8df5d9a3a4e3c3a262c789426956546e950d417f472c72c1163" protocol=ttrpc version=3
Dec 16 12:21:31.761109 systemd[1]: Started cri-containerd-de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae.scope - libcontainer container de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae.
Dec 16 12:21:31.774000 audit: BPF prog-id=262 op=LOAD
Dec 16 12:21:31.774000 audit: BPF prog-id=263 op=LOAD
Dec 16 12:21:31.774000 audit[5744]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.774000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.774000 audit: BPF prog-id=263 op=UNLOAD
Dec 16 12:21:31.774000 audit[5744]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.774000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.774000 audit: BPF prog-id=264 op=LOAD
Dec 16 12:21:31.774000 audit[5744]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.774000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.774000 audit: BPF prog-id=265 op=LOAD
Dec 16 12:21:31.774000 audit[5744]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.774000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.775000 audit: BPF prog-id=265 op=UNLOAD
Dec 16 12:21:31.775000 audit[5744]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.775000 audit: BPF prog-id=264 op=UNLOAD
Dec 16 12:21:31.775000 audit[5744]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.775000 audit: BPF prog-id=266 op=LOAD
Dec 16 12:21:31.775000 audit[5744]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3040 pid=5744 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:31.775000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465333738636564376338656534633462306462663639323439623336
Dec 16 12:21:31.793278 containerd[1662]: time="2025-12-16T12:21:31.793238307Z" level=info msg="StartContainer for \"de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae\" returns successfully"
Dec 16 12:21:34.112658 kubelet[2932]: E1216 12:21:34.112338 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-tp52x" podUID="1d6077b5-49a5-4d21-bd6b-0ffae41c4da0"
Dec 16 12:21:36.101940 systemd[1]: cri-containerd-10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c.scope: Deactivated successfully.
Dec 16 12:21:36.102321 systemd[1]: cri-containerd-10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c.scope: Consumed 4.006s CPU time, 24M memory peak.
Dec 16 12:21:36.102000 audit: BPF prog-id=267 op=LOAD
Dec 16 12:21:36.104354 kernel: kauditd_printk_skb: 40 callbacks suppressed
Dec 16 12:21:36.104418 kernel: audit: type=1334 audit(1765887696.102:914): prog-id=267 op=LOAD
Dec 16 12:21:36.104445 kernel: audit: type=1334 audit(1765887696.102:915): prog-id=93 op=UNLOAD
Dec 16 12:21:36.102000 audit: BPF prog-id=93 op=UNLOAD
Dec 16 12:21:36.104518 containerd[1662]: time="2025-12-16T12:21:36.104389915Z" level=info msg="received container exit event container_id:\"10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c\" id:\"10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c\" pid:2776 exit_status:1 exited_at:{seconds:1765887696 nanos:104006354}"
Dec 16 12:21:36.109000 audit: BPF prog-id=103 op=UNLOAD
Dec 16 12:21:36.111559 kubelet[2932]: E1216 12:21:36.111527 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-khwzf" podUID="c36f8480-80e5-4b1d-b264-0fc54133bb26"
Dec 16 12:21:36.109000 audit: BPF prog-id=107 op=UNLOAD
Dec 16 12:21:36.112241 kubelet[2932]: E1216 12:21:36.112195 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-74bfdd549c-rv9w7" podUID="29a341e5-f159-490a-9c2d-ec19b832725b"
Dec 16 12:21:36.112332 kernel: audit: type=1334 audit(1765887696.109:916): prog-id=103 op=UNLOAD
Dec 16 12:21:36.112371 kernel: audit: type=1334 audit(1765887696.109:917): prog-id=107 op=UNLOAD
Dec 16 12:21:36.129840 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c-rootfs.mount: Deactivated successfully.
Dec 16 12:21:36.730567 kubelet[2932]: I1216 12:21:36.730535 2932 scope.go:117] "RemoveContainer" containerID="10a46cef7f90c7580a77ec7626e9390580b228e9b6aecf4e3a69135da12ce63c"
Dec 16 12:21:36.732284 containerd[1662]: time="2025-12-16T12:21:36.732247606Z" level=info msg="CreateContainer within sandbox \"59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}"
Dec 16 12:21:36.741240 containerd[1662]: time="2025-12-16T12:21:36.741183195Z" level=info msg="Container f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:21:36.752315 containerd[1662]: time="2025-12-16T12:21:36.752162510Z" level=info msg="CreateContainer within sandbox \"59c15d7387cbda02f31a180d709243bb48221642afa1da2773384a2bfab518ae\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8\""
Dec 16 12:21:36.753646 containerd[1662]: time="2025-12-16T12:21:36.753016473Z" level=info msg="StartContainer for \"f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8\""
Dec 16 12:21:36.754917 containerd[1662]: time="2025-12-16T12:21:36.754887039Z" level=info msg="connecting to shim f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8" address="unix:///run/containerd/s/e4e52a02362517024b177fd16ec1546c58b0316891f3e2e8d94eae78e4bd83f0" protocol=ttrpc version=3
Dec 16 12:21:36.787071 systemd[1]: Started cri-containerd-f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8.scope - libcontainer container f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8.
Dec 16 12:21:36.798000 audit: BPF prog-id=268 op=LOAD
Dec 16 12:21:36.799000 audit: BPF prog-id=269 op=LOAD
Dec 16 12:21:36.800943 kernel: audit: type=1334 audit(1765887696.798:918): prog-id=268 op=LOAD
Dec 16 12:21:36.801006 kernel: audit: type=1334 audit(1765887696.799:919): prog-id=269 op=LOAD
Dec 16 12:21:36.801030 kernel: audit: type=1300 audit(1765887696.799:919): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.799000 audit[5791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.806994 kernel: audit: type=1327 audit(1765887696.799:919): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.807077 kernel: audit: type=1334 audit(1765887696.799:920): prog-id=269 op=UNLOAD
Dec 16 12:21:36.799000 audit: BPF prog-id=269 op=UNLOAD
Dec 16 12:21:36.799000 audit[5791]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.810774 kernel: audit: type=1300 audit(1765887696.799:920): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.799000 audit: BPF prog-id=270 op=LOAD
Dec 16 12:21:36.799000 audit[5791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.799000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.800000 audit: BPF prog-id=271 op=LOAD
Dec 16 12:21:36.800000 audit[5791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.800000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.803000 audit: BPF prog-id=271 op=UNLOAD
Dec 16 12:21:36.803000 audit[5791]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.803000 audit: BPF prog-id=270 op=UNLOAD
Dec 16 12:21:36.803000 audit[5791]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.803000 audit: BPF prog-id=272 op=LOAD
Dec 16 12:21:36.803000 audit[5791]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2640 pid=5791 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:21:36.803000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6632306431396266386238363739323461386265393962366366333235
Dec 16 12:21:36.836101 containerd[1662]: time="2025-12-16T12:21:36.836061379Z" level=info msg="StartContainer for \"f20d19bf8b867924a8be99b6cf325b1fb068f1ed3a32378ba4e795bb7a649af8\" returns successfully"
Dec 16 12:21:39.783528 kubelet[2932]: E1216 12:21:39.783430 2932 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4547-0-0-5-b12717c6ea)"
Dec 16 12:21:41.683252 update_engine[1649]: I20251216 12:21:41.682741 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Dec 16 12:21:41.683252 update_engine[1649]: I20251216 12:21:41.682933 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Dec 16 12:21:41.683749 update_engine[1649]: I20251216 12:21:41.683553 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Dec 16 12:21:41.690876 update_engine[1649]: E20251216 12:21:41.690826 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Dec 16 12:21:41.691053 update_engine[1649]: I20251216 12:21:41.690915 1649 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Dec 16 12:21:41.691053 update_engine[1649]: I20251216 12:21:41.690927 1649 omaha_request_action.cc:617] Omaha request response:
Dec 16 12:21:41.691213 update_engine[1649]: E20251216 12:21:41.691072 1649 omaha_request_action.cc:636] Omaha request network transfer failed.
Dec 16 12:21:41.691213 update_engine[1649]: I20251216 12:21:41.691118 1649 action_processor.cc:68] ActionProcessor::ActionComplete: OmahaRequestAction action failed. Aborting processing.
Dec 16 12:21:41.691213 update_engine[1649]: I20251216 12:21:41.691134 1649 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Dec 16 12:21:41.691213 update_engine[1649]: I20251216 12:21:41.691147 1649 update_attempter.cc:306] Processing Done.
Dec 16 12:21:41.691213 update_engine[1649]: E20251216 12:21:41.691176 1649 update_attempter.cc:619] Update failed.
Dec 16 12:21:41.691213 update_engine[1649]: I20251216 12:21:41.691191 1649 utils.cc:600] Converting error code 2000 to kActionCodeOmahaErrorInHTTPResponse
Dec 16 12:21:41.691213 update_engine[1649]: I20251216 12:21:41.691205 1649 payload_state.cc:97] Updating payload state for error code: 37 (kActionCodeOmahaErrorInHTTPResponse)
Dec 16 12:21:41.691378 update_engine[1649]: I20251216 12:21:41.691218 1649 payload_state.cc:103] Ignoring failures until we get a valid Omaha response.
Dec 16 12:21:41.691378 update_engine[1649]: I20251216 12:21:41.691303 1649 action_processor.cc:36] ActionProcessor::StartProcessing: OmahaRequestAction
Dec 16 12:21:41.691378 update_engine[1649]: I20251216 12:21:41.691323 1649 omaha_request_action.cc:271] Posting an Omaha request to disabled
Dec 16 12:21:41.691378 update_engine[1649]: I20251216 12:21:41.691329 1649 omaha_request_action.cc:272] Request:
Dec 16 12:21:41.691378 update_engine[1649]:
Dec 16 12:21:41.691378 update_engine[1649]:
Dec 16 12:21:41.691378 update_engine[1649]:
Dec 16 12:21:41.691378 update_engine[1649]:
Dec 16 12:21:41.691378 update_engine[1649]:
Dec 16 12:21:41.691378 update_engine[1649]:
Dec 16 12:21:41.691378 update_engine[1649]: I20251216 12:21:41.691334 1649 libcurl_http_fetcher.cc:47] Starting/Resuming transfer
Dec 16 12:21:41.691378 update_engine[1649]: I20251216 12:21:41.691354 1649 libcurl_http_fetcher.cc:151] Setting up curl options for HTTP
Dec 16 12:21:41.691779 update_engine[1649]: I20251216 12:21:41.691638 1649 libcurl_http_fetcher.cc:449] Setting up timeout source: 1 seconds.
Dec 16 12:21:41.691933 locksmithd[1697]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_REPORTING_ERROR_EVENT" NewVersion=0.0.0 NewSize=0
Dec 16 12:21:41.697719 update_engine[1649]: E20251216 12:21:41.697669 1649 libcurl_http_fetcher.cc:266] Unable to get http response code: Could not resolve host: disabled (Domain name not found)
Dec 16 12:21:41.697783 update_engine[1649]: I20251216 12:21:41.697753 1649 libcurl_http_fetcher.cc:297] Transfer resulted in an error (0), 0 bytes downloaded
Dec 16 12:21:41.697783 update_engine[1649]: I20251216 12:21:41.697763 1649 omaha_request_action.cc:617] Omaha request response:
Dec 16 12:21:41.697783 update_engine[1649]: I20251216 12:21:41.697770 1649 action_processor.cc:65] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Dec 16 12:21:41.697783 update_engine[1649]: I20251216 12:21:41.697775 1649 action_processor.cc:73] ActionProcessor::ActionComplete: finished last action of type OmahaRequestAction
Dec 16 12:21:41.697783 update_engine[1649]: I20251216 12:21:41.697780 1649 update_attempter.cc:306] Processing Done.
Dec 16 12:21:41.697783 update_engine[1649]: I20251216 12:21:41.697784 1649 update_attempter.cc:310] Error event sent.
Dec 16 12:21:41.698302 update_engine[1649]: I20251216 12:21:41.697793 1649 update_check_scheduler.cc:74] Next update check in 42m16s
Dec 16 12:21:41.698329 locksmithd[1697]: LastCheckedTime=0 Progress=0 CurrentOperation="UPDATE_STATUS_IDLE" NewVersion=0.0.0 NewSize=0
Dec 16 12:21:42.110335 kubelet[2932]: E1216 12:21:42.110206 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-xvgpm" podUID="f42d8c07-697c-492a-826b-630e49b3282d"
Dec 16 12:21:42.110658 kubelet[2932]: E1216 12:21:42.110321 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-5d558c8677-8zlvg" podUID="963a1289-7bdf-4dde-a21a-b1e5da9ecfcc"
Dec 16 12:21:42.980487 systemd[1]: cri-containerd-de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae.scope: Deactivated successfully.
Dec 16 12:21:42.981331 containerd[1662]: time="2025-12-16T12:21:42.981067221Z" level=info msg="received container exit event container_id:\"de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae\" id:\"de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae\" pid:5756 exit_status:1 exited_at:{seconds:1765887702 nanos:980840260}"
Dec 16 12:21:42.984000 audit: BPF prog-id=262 op=UNLOAD
Dec 16 12:21:42.986545 kernel: kauditd_printk_skb: 16 callbacks suppressed
Dec 16 12:21:42.986621 kernel: audit: type=1334 audit(1765887702.984:926): prog-id=262 op=UNLOAD
Dec 16 12:21:42.986643 kernel: audit: type=1334 audit(1765887702.984:927): prog-id=266 op=UNLOAD
Dec 16 12:21:42.984000 audit: BPF prog-id=266 op=UNLOAD
Dec 16 12:21:43.001128 systemd[1794]: Created slice background.slice - User Background Tasks Slice.
Dec 16 12:21:43.001299 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae-rootfs.mount: Deactivated successfully.
Dec 16 12:21:43.003074 systemd[1794]: Starting systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories...
Dec 16 12:21:43.024245 systemd[1794]: Finished systemd-tmpfiles-clean.service - Cleanup of User's Temporary Files and Directories.
Dec 16 12:21:43.748799 kubelet[2932]: I1216 12:21:43.748770 2932 scope.go:117] "RemoveContainer" containerID="1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4"
Dec 16 12:21:43.749862 kubelet[2932]: I1216 12:21:43.749377 2932 scope.go:117] "RemoveContainer" containerID="de378ced7c8ee4c4b0dbf69249b36b043d62f1603a803037bd5323eac6d981ae"
Dec 16 12:21:43.749862 kubelet[2932]: E1216 12:21:43.749548 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-65cdcdfd6d-4tjvn_tigera-operator(381898a3-7c53-40de-8c60-fd63fe146417)\"" pod="tigera-operator/tigera-operator-65cdcdfd6d-4tjvn" podUID="381898a3-7c53-40de-8c60-fd63fe146417"
Dec 16 12:21:43.751132 containerd[1662]: time="2025-12-16T12:21:43.751050127Z" level=info msg="RemoveContainer for \"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\""
Dec 16 12:21:43.757775 containerd[1662]: time="2025-12-16T12:21:43.757739069Z" level=info msg="RemoveContainer for \"1ea377966023efa09fdf843774669037373183530e90cbfc3bbd485b58f2c7c4\" returns successfully"
Dec 16 12:21:45.110470 kubelet[2932]: E1216 12:21:45.110431 2932 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-7db869f9-8vr6z" podUID="01d79110-1504-420e-beb4-8a0f58f3b458"
Dec 16 12:21:45.667886 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec