Dec 16 12:14:16.383738 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 16 12:14:16.383760 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Dec 16 00:05:24 -00 2025
Dec 16 12:14:16.383771 kernel: KASLR enabled
Dec 16 12:14:16.383776 kernel: efi: EFI v2.7 by EDK II
Dec 16 12:14:16.383782 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Dec 16 12:14:16.383788 kernel: random: crng init done
Dec 16 12:14:16.383795 kernel: secureboot: Secure boot disabled
Dec 16 12:14:16.383801 kernel: ACPI: Early table checksum verification disabled
Dec 16 12:14:16.383807 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Dec 16 12:14:16.383814 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Dec 16 12:14:16.383820 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383826 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383832 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383838 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383847 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383853 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383860 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383866 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383873 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383879 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:14:16.383886 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 16 12:14:16.383892 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 16 12:14:16.383898 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 16 12:14:16.383906 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Dec 16 12:14:16.383913 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Dec 16 12:14:16.383919 kernel: Zone ranges:
Dec 16 12:14:16.383925 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 16 12:14:16.383931 kernel: DMA32 empty
Dec 16 12:14:16.383938 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 16 12:14:16.383944 kernel: Device empty
Dec 16 12:14:16.383950 kernel: Movable zone start for each node
Dec 16 12:14:16.383956 kernel: Early memory node ranges
Dec 16 12:14:16.383963 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Dec 16 12:14:16.383969 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Dec 16 12:14:16.383975 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Dec 16 12:14:16.383983 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Dec 16 12:14:16.383989 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Dec 16 12:14:16.383996 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Dec 16 12:14:16.384002 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 16 12:14:16.384009 kernel: psci: probing for conduit method from ACPI.
Dec 16 12:14:16.384017 kernel: psci: PSCIv1.3 detected in firmware.
Dec 16 12:14:16.384026 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 16 12:14:16.384032 kernel: psci: Trusted OS migration not required
Dec 16 12:14:16.384039 kernel: psci: SMC Calling Convention v1.1
Dec 16 12:14:16.384046 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 16 12:14:16.384053 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 16 12:14:16.384059 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 16 12:14:16.384066 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Dec 16 12:14:16.384092 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Dec 16 12:14:16.384108 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 16 12:14:16.384116 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 16 12:14:16.384123 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Dec 16 12:14:16.384130 kernel: Detected PIPT I-cache on CPU0
Dec 16 12:14:16.384137 kernel: CPU features: detected: GIC system register CPU interface
Dec 16 12:14:16.384143 kernel: CPU features: detected: Spectre-v4
Dec 16 12:14:16.384150 kernel: CPU features: detected: Spectre-BHB
Dec 16 12:14:16.384157 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 16 12:14:16.384164 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 16 12:14:16.384170 kernel: CPU features: detected: ARM erratum 1418040
Dec 16 12:14:16.384179 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 16 12:14:16.384189 kernel: alternatives: applying boot alternatives
Dec 16 12:14:16.384197 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749
Dec 16 12:14:16.384204 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 16 12:14:16.384211 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 16 12:14:16.384217 kernel: Fallback order for Node 0: 0
Dec 16 12:14:16.384224 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Dec 16 12:14:16.384231 kernel: Policy zone: Normal
Dec 16 12:14:16.384240 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 12:14:16.384247 kernel: software IO TLB: area num 4.
Dec 16 12:14:16.384254 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 16 12:14:16.384262 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 16 12:14:16.384269 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 12:14:16.384276 kernel: rcu: RCU event tracing is enabled.
Dec 16 12:14:16.384283 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Dec 16 12:14:16.384290 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 12:14:16.384297 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 12:14:16.384304 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 12:14:16.384311 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 16 12:14:16.384317 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:14:16.384324 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:14:16.384331 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 16 12:14:16.384339 kernel: GICv3: 256 SPIs implemented
Dec 16 12:14:16.384346 kernel: GICv3: 0 Extended SPIs implemented
Dec 16 12:14:16.384353 kernel: Root IRQ handler: gic_handle_irq
Dec 16 12:14:16.384359 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 16 12:14:16.384366 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 16 12:14:16.384373 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 16 12:14:16.384380 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 16 12:14:16.384387 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Dec 16 12:14:16.384393 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Dec 16 12:14:16.384400 kernel: GICv3: using LPI property table @0x0000000100130000
Dec 16 12:14:16.384407 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Dec 16 12:14:16.384414 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 12:14:16.384422 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:14:16.384429 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 16 12:14:16.384435 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 16 12:14:16.384443 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 16 12:14:16.384449 kernel: arm-pv: using stolen time PV
Dec 16 12:14:16.384460 kernel: Console: colour dummy device 80x25
Dec 16 12:14:16.384467 kernel: ACPI: Core revision 20240827
Dec 16 12:14:16.384474 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 16 12:14:16.384485 kernel: pid_max: default: 32768 minimum: 301
Dec 16 12:14:16.384493 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 12:14:16.384500 kernel: landlock: Up and running.
Dec 16 12:14:16.384507 kernel: SELinux: Initializing.
Dec 16 12:14:16.384514 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:14:16.384521 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:14:16.384528 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 12:14:16.384536 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 12:14:16.384544 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 12:14:16.384551 kernel: Remapping and enabling EFI services.
Dec 16 12:14:16.384558 kernel: smp: Bringing up secondary CPUs ...
Dec 16 12:14:16.384565 kernel: Detected PIPT I-cache on CPU1
Dec 16 12:14:16.384573 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 16 12:14:16.384580 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Dec 16 12:14:16.384587 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:14:16.384595 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 16 12:14:16.384602 kernel: Detected PIPT I-cache on CPU2
Dec 16 12:14:16.384616 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Dec 16 12:14:16.384625 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Dec 16 12:14:16.384632 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:14:16.384640 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Dec 16 12:14:16.384647 kernel: Detected PIPT I-cache on CPU3
Dec 16 12:14:16.384654 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Dec 16 12:14:16.384663 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Dec 16 12:14:16.384671 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:14:16.384678 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Dec 16 12:14:16.384685 kernel: smp: Brought up 1 node, 4 CPUs
Dec 16 12:14:16.384692 kernel: SMP: Total of 4 processors activated.
Dec 16 12:14:16.384700 kernel: CPU: All CPU(s) started at EL1
Dec 16 12:14:16.384708 kernel: CPU features: detected: 32-bit EL0 Support
Dec 16 12:14:16.384716 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 16 12:14:16.384724 kernel: CPU features: detected: Common not Private translations
Dec 16 12:14:16.384731 kernel: CPU features: detected: CRC32 instructions
Dec 16 12:14:16.384738 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 16 12:14:16.384746 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 16 12:14:16.384753 kernel: CPU features: detected: LSE atomic instructions
Dec 16 12:14:16.384762 kernel: CPU features: detected: Privileged Access Never
Dec 16 12:14:16.384770 kernel: CPU features: detected: RAS Extension Support
Dec 16 12:14:16.384777 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 16 12:14:16.384784 kernel: alternatives: applying system-wide alternatives
Dec 16 12:14:16.384792 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Dec 16 12:14:16.384800 kernel: Memory: 16324432K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 12480K init, 1038K bss, 430000K reserved, 16384K cma-reserved)
Dec 16 12:14:16.384810 kernel: devtmpfs: initialized
Dec 16 12:14:16.384818 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 12:14:16.384827 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Dec 16 12:14:16.384834 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 16 12:14:16.384842 kernel: 0 pages in range for non-PLT usage
Dec 16 12:14:16.384849 kernel: 515168 pages in range for PLT usage
Dec 16 12:14:16.384857 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 12:14:16.384864 kernel: SMBIOS 3.0.0 present.
Dec 16 12:14:16.384871 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Dec 16 12:14:16.384880 kernel: DMI: Memory slots populated: 1/1
Dec 16 12:14:16.384887 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 12:14:16.384895 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 16 12:14:16.384903 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 16 12:14:16.384911 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 16 12:14:16.384918 kernel: audit: initializing netlink subsys (disabled)
Dec 16 12:14:16.384926 kernel: audit: type=2000 audit(0.036:1): state=initialized audit_enabled=0 res=1
Dec 16 12:14:16.384934 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 12:14:16.384942 kernel: cpuidle: using governor menu
Dec 16 12:14:16.384949 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 16 12:14:16.384957 kernel: ASID allocator initialised with 32768 entries
Dec 16 12:14:16.384964 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 12:14:16.384972 kernel: Serial: AMBA PL011 UART driver
Dec 16 12:14:16.384979 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 12:14:16.384988 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 12:14:16.384995 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 16 12:14:16.385003 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 16 12:14:16.385010 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 12:14:16.385018 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 12:14:16.385025 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 16 12:14:16.385033 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 16 12:14:16.385040 kernel: ACPI: Added _OSI(Module Device)
Dec 16 12:14:16.385049 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 12:14:16.385056 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 12:14:16.385063 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 12:14:16.385078 kernel: ACPI: Interpreter enabled
Dec 16 12:14:16.385088 kernel: ACPI: Using GIC for interrupt routing
Dec 16 12:14:16.385098 kernel: ACPI: MCFG table detected, 1 entries
Dec 16 12:14:16.385105 kernel: ACPI: CPU0 has been hot-added
Dec 16 12:14:16.385114 kernel: ACPI: CPU1 has been hot-added
Dec 16 12:14:16.385122 kernel: ACPI: CPU2 has been hot-added
Dec 16 12:14:16.385129 kernel: ACPI: CPU3 has been hot-added
Dec 16 12:14:16.385136 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 16 12:14:16.385144 kernel: printk: legacy console [ttyAMA0] enabled
Dec 16 12:14:16.385152 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 12:14:16.385314 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 12:14:16.385412 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 16 12:14:16.385493 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 16 12:14:16.385572 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 16 12:14:16.385672 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 16 12:14:16.385682 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 16 12:14:16.385689 kernel: PCI host bridge to bus 0000:00
Dec 16 12:14:16.385782 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 16 12:14:16.385857 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 16 12:14:16.385937 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 16 12:14:16.386009 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 12:14:16.386141 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 16 12:14:16.386245 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.386334 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Dec 16 12:14:16.386414 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 16 12:14:16.386493 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Dec 16 12:14:16.386571 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 16 12:14:16.386658 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.386739 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Dec 16 12:14:16.386818 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 16 12:14:16.386896 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Dec 16 12:14:16.386981 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.387059 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Dec 16 12:14:16.387167 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 16 12:14:16.387264 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Dec 16 12:14:16.387344 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 16 12:14:16.387430 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.387509 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Dec 16 12:14:16.387592 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 16 12:14:16.387674 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 16 12:14:16.387760 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.387838 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Dec 16 12:14:16.387940 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 16 12:14:16.388018 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Dec 16 12:14:16.388122 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 16 12:14:16.388221 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.388302 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Dec 16 12:14:16.388380 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 16 12:14:16.388457 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Dec 16 12:14:16.388540 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 16 12:14:16.388626 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.388711 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Dec 16 12:14:16.388790 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 16 12:14:16.388879 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.388958 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Dec 16 12:14:16.389037 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 16 12:14:16.389140 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.389230 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Dec 16 12:14:16.389308 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 16 12:14:16.389396 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.389475 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Dec 16 12:14:16.389555 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 16 12:14:16.389656 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.389735 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Dec 16 12:14:16.389813 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 16 12:14:16.389898 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.389980 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Dec 16 12:14:16.390062 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 16 12:14:16.390169 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.390255 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Dec 16 12:14:16.390344 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 16 12:14:16.390435 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.390513 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Dec 16 12:14:16.390593 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 16 12:14:16.390683 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.390766 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Dec 16 12:14:16.390854 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 16 12:14:16.390940 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.391022 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Dec 16 12:14:16.391129 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 16 12:14:16.391228 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.391312 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Dec 16 12:14:16.391396 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 16 12:14:16.391484 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.391570 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Dec 16 12:14:16.391653 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 16 12:14:16.391734 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Dec 16 12:14:16.391813 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Dec 16 12:14:16.391902 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.391980 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Dec 16 12:14:16.392065 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 16 12:14:16.392186 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Dec 16 12:14:16.392321 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Dec 16 12:14:16.392414 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.392494 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Dec 16 12:14:16.392579 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 16 12:14:16.392661 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Dec 16 12:14:16.392742 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Dec 16 12:14:16.392831 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.392910 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Dec 16 12:14:16.392988 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 16 12:14:16.393066 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Dec 16 12:14:16.393172 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Dec 16 12:14:16.393267 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.393346 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Dec 16 12:14:16.393429 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 16 12:14:16.393512 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Dec 16 12:14:16.393602 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Dec 16 12:14:16.393698 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.393778 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Dec 16 12:14:16.393860 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 16 12:14:16.393940 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Dec 16 12:14:16.394017 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Dec 16 12:14:16.394128 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.394214 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Dec 16 12:14:16.394292 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 16 12:14:16.394371 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Dec 16 12:14:16.394449 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Dec 16 12:14:16.394537 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.394618 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Dec 16 12:14:16.394705 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 16 12:14:16.394789 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Dec 16 12:14:16.394867 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Dec 16 12:14:16.394953 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.395032 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Dec 16 12:14:16.395126 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 16 12:14:16.395209 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Dec 16 12:14:16.395288 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 16 12:14:16.395373 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.395452 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Dec 16 12:14:16.395533 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 16 12:14:16.395611 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Dec 16 12:14:16.395698 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 16 12:14:16.395783 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.395862 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Dec 16 12:14:16.395947 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 16 12:14:16.396044 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Dec 16 12:14:16.396133 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 16 12:14:16.396229 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.396338 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Dec 16 12:14:16.396419 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 16 12:14:16.396502 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Dec 16 12:14:16.396579 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Dec 16 12:14:16.396663 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.396750 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Dec 16 12:14:16.396835 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 16 12:14:16.396914 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Dec 16 12:14:16.396998 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Dec 16 12:14:16.397097 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.397190 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Dec 16 12:14:16.397270 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 16 12:14:16.397355 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Dec 16 12:14:16.397441 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Dec 16 12:14:16.397535 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.397627 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Dec 16 12:14:16.397712 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 16 12:14:16.397791 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Dec 16 12:14:16.397869 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Dec 16 12:14:16.397954 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:14:16.398054 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Dec 16 12:14:16.398164 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 16 12:14:16.398247 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Dec 16 12:14:16.398327 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Dec 16 12:14:16.398420 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 12:14:16.398503 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Dec 16 12:14:16.398589 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 16 12:14:16.398681 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 16 12:14:16.398772 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 12:14:16.398854 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Dec 16 12:14:16.398944 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 16 12:14:16.399028 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 16 12:14:16.399123 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 16 12:14:16.399215 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:14:16.399301 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 16 12:14:16.399401 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:14:16.399496 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 16 12:14:16.399581 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 16 12:14:16.399673 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 16 12:14:16.399758 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 16 12:14:16.399842 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 16 12:14:16.399931 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 16 12:14:16.400036 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:14:16.400135 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:14:16.400234 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 16 12:14:16.400331 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 16 12:14:16.400417 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 16 12:14:16.400500 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 16 12:14:16.400581 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:14:16.400659 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:14:16.400741 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 16 12:14:16.400823 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 16 12:14:16.400905 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 16 12:14:16.400996 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 16 12:14:16.401093 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:14:16.401173 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:14:16.401256 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 16 12:14:16.401347 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:14:16.401439 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:14:16.401530 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 12:14:16.401621 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:14:16.401706 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:14:16.401790 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 12:14:16.401872 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:14:16.401974 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:14:16.402062 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 12:14:16.402161 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:14:16.402243 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:14:16.402329 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 16 12:14:16.402413 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:14:16.402492 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:14:16.402580 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 16 12:14:16.402659 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:14:16.402738 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:14:16.402833 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 16 12:14:16.402916 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:14:16.402999 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:14:16.403097 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 16 12:14:16.403182 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 16 12:14:16.403264 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 16
12:14:16.403351 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Dec 16 12:14:16.403430 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Dec 16 12:14:16.403508 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Dec 16 12:14:16.403595 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Dec 16 12:14:16.403677 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Dec 16 12:14:16.403758 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Dec 16 12:14:16.403841 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Dec 16 12:14:16.403919 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Dec 16 12:14:16.404002 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Dec 16 12:14:16.404102 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Dec 16 12:14:16.404185 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Dec 16 12:14:16.404267 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Dec 16 12:14:16.404350 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Dec 16 12:14:16.404430 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Dec 16 12:14:16.404509 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Dec 16 12:14:16.404590 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Dec 16 12:14:16.404671 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Dec 16 12:14:16.404750 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Dec 16 12:14:16.404831 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Dec 16 12:14:16.404910 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Dec 16 12:14:16.404989 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Dec 16 12:14:16.405070 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Dec 16 12:14:16.405176 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Dec 16 12:14:16.405256 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Dec 16 12:14:16.405342 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Dec 16 12:14:16.405422 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Dec 16 12:14:16.405500 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Dec 16 12:14:16.405598 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Dec 16 12:14:16.405691 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Dec 16 12:14:16.405770 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Dec 16 12:14:16.405853 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Dec 16 12:14:16.405936 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Dec 16 12:14:16.406022 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Dec 16 12:14:16.406123 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Dec 16 12:14:16.406213 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Dec 16 12:14:16.406298 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Dec 16 12:14:16.406381 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Dec 16 12:14:16.406467 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Dec 16 12:14:16.406547 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Dec 16 12:14:16.406643 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Dec 16 12:14:16.406728 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Dec 16 12:14:16.406806 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Dec 16 12:14:16.406895 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Dec 16 12:14:16.406976 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Dec 16 12:14:16.407054 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Dec 16 12:14:16.407156 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Dec 16 12:14:16.407242 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Dec 16 12:14:16.407322 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Dec 16 12:14:16.407409 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Dec 16 12:14:16.407489 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Dec 16 12:14:16.407574 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Dec 16 12:14:16.407656 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Dec 16 12:14:16.407741 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Dec 16 12:14:16.407823 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Dec 16 12:14:16.407906 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Dec 16 12:14:16.407985 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Dec 16 12:14:16.408065 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Dec 16 12:14:16.408172 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Dec 16 12:14:16.408263 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Dec 16 12:14:16.408346 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Dec 16 12:14:16.408427 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 16 12:14:16.408506 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Dec 16 12:14:16.408587 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 16 12:14:16.408669 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 16 12:14:16.408752 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 16 12:14:16.408831 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 16 12:14:16.408913 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 16 12:14:16.408993 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 16 12:14:16.409097 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 16 12:14:16.409186 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 16 12:14:16.409269 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 16 12:14:16.409354 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 16 12:14:16.409440 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 16 12:14:16.409521 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 16 12:14:16.409615 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 16 12:14:16.409700 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 16 12:14:16.409787 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 16 12:14:16.409866 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 16 12:14:16.409950 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Dec 16 12:14:16.410033 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Dec 16 12:14:16.410136 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Dec 16 12:14:16.410224 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Dec 16 12:14:16.410306 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Dec 16 12:14:16.410387 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Dec 16 12:14:16.410472 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Dec 16 12:14:16.410553 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Dec 16 12:14:16.410636 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Dec 16 12:14:16.410719 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Dec 16 12:14:16.410799 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Dec 16 12:14:16.410881 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Dec 16 12:14:16.410968 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Dec 16 12:14:16.411055 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Dec 16 12:14:16.411157 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Dec 16 12:14:16.411243 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Dec 16 12:14:16.411323 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Dec 16 12:14:16.411411 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Dec 16 12:14:16.411519 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Dec 16 12:14:16.411601 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Dec 16 12:14:16.411687 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Dec 16 12:14:16.411764 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Dec 16 12:14:16.411847 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Dec 16 12:14:16.411927 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Dec 16 12:14:16.412027 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Dec 16 12:14:16.412131 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Dec 16 12:14:16.412222 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Dec 16 12:14:16.412311 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Dec 16 12:14:16.412398 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Dec 16 12:14:16.412483 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Dec 16 12:14:16.412575 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Dec 16 12:14:16.412657 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Dec 16 12:14:16.412743 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Dec 16 12:14:16.412823 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Dec 16 12:14:16.412905 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Dec 16 12:14:16.412985 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Dec 16 12:14:16.413066 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Dec 16 12:14:16.413192 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Dec 16 12:14:16.413276 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Dec 16 12:14:16.413357 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Dec 16 12:14:16.413443 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Dec 16 12:14:16.413522 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Dec 16 12:14:16.413621 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Dec 16 12:14:16.413714 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Dec 16 12:14:16.413804 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Dec 16 12:14:16.413887 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Dec 16 12:14:16.413973 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Dec 16 12:14:16.414059 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Dec 16 12:14:16.414171 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Dec 16 12:14:16.414260 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:14:16.414348 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Dec 16 12:14:16.414427 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:14:16.414508 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Dec 16 12:14:16.414591 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:14:16.414671 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Dec 16 12:14:16.414765 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:14:16.414848 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Dec 16 12:14:16.414938 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:14:16.415023 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Dec 16 12:14:16.415134 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:14:16.415229 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Dec 16 12:14:16.415316 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:14:16.415397 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Dec 16 12:14:16.415476 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:14:16.415556 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Dec 16 12:14:16.415635 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:14:16.415718 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Dec 16 12:14:16.415802 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Dec 16 12:14:16.415884 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Dec 16 12:14:16.415964 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Dec 16 12:14:16.416048 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Dec 16 12:14:16.416147 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Dec 16 12:14:16.416228 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Dec 16 12:14:16.416315 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Dec 16 12:14:16.416399 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Dec 16 12:14:16.416482 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Dec 16 12:14:16.416569 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Dec 16 12:14:16.416656 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Dec 16 12:14:16.416761 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Dec 16 12:14:16.416852 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 
12:14:16.416937 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.417019 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Dec 16 12:14:16.417131 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.417219 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.417299 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Dec 16 12:14:16.417388 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.417476 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.417556 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Dec 16 12:14:16.417653 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.417738 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.417823 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Dec 16 12:14:16.417904 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.417996 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.418092 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Dec 16 12:14:16.418187 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.418280 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.418369 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Dec 16 12:14:16.418453 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.418532 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.418614 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Dec 16 12:14:16.418701 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.418791 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.418878 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Dec 16 12:14:16.418961 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.419044 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.419151 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Dec 16 12:14:16.419231 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.419322 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.419413 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Dec 16 12:14:16.419510 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.419594 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.419679 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Dec 16 12:14:16.419759 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.419837 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.419924 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Dec 16 12:14:16.420006 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.420100 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.420189 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Dec 16 12:14:16.420273 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.420351 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.420432 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Dec 16 12:14:16.420511 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.420593 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.420678 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Dec 16 12:14:16.420758 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.420836 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.420916 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Dec 16 12:14:16.420996 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.421092 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.421183 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Dec 16 12:14:16.421264 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.421343 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.421423 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:14:16.421505 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:14:16.421603 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:14:16.421689 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:14:16.421770 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:14:16.421860 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:14:16.421943 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:14:16.422024 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:14:16.422117 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:14:16.422210 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Dec 16 12:14:16.422291 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Dec 16 12:14:16.422371 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Dec 16 12:14:16.422450 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Dec 16 12:14:16.422534 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Dec 16 12:14:16.422619 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Dec 16 12:14:16.422707 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.422787 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.422868 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.422951 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.423035 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.423127 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.423217 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.423301 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.423383 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.423465 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.423546 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.423628 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.423709 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.423788 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.423869 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Dec 16 12:14:16.423951 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.424031 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.424126 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.424219 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.424305 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.424386 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.424475 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.424560 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.424640 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.424728 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.424817 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.424904 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.424988 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.425070 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.425183 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.425270 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.425361 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.425458 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.425539 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.425637 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Dec 16 12:14:16.425718 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:14:16.425810 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 16 12:14:16.425896 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 16 12:14:16.425980 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 16 12:14:16.426059 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Dec 16 12:14:16.426158 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:14:16.426243 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:14:16.426329 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 16 12:14:16.426408 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Dec 16 12:14:16.426490 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:14:16.426574 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:14:16.426663 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Dec 16 12:14:16.426748 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 16 12:14:16.426839 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Dec 16 12:14:16.426927 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:14:16.427009 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:14:16.427106 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 16 12:14:16.427194 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Dec 16 12:14:16.427280 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:14:16.427366 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:14:16.427455 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Dec 16 12:14:16.427544 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Dec 16 12:14:16.427634 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Dec 16 12:14:16.427719 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:14:16.427804 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:14:16.427891 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 12:14:16.427973 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 12:14:16.428068 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Dec 16 12:14:16.428164 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:14:16.428254 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:14:16.428343 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Dec 16 12:14:16.428427 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:14:16.428507 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:14:16.428591 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Dec 16 12:14:16.428678 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:14:16.428759 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:14:16.428842 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Dec 16 12:14:16.428920 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:14:16.428999 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:14:16.429092 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Dec 16 12:14:16.429176 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Dec 16 12:14:16.429260 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:14:16.429340 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Dec 16 12:14:16.429422 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Dec 16 12:14:16.429503 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:14:16.429593 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Dec 16 12:14:16.429678 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Dec 16 12:14:16.429761 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:14:16.429841 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Dec 16 12:14:16.429920 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Dec 16 12:14:16.430001 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:14:16.430092 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Dec 16 12:14:16.430176 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Dec 16 12:14:16.430270 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:14:16.430365 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Dec 16 12:14:16.430456 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Dec 16 12:14:16.430550 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:14:16.430630 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Dec 16 12:14:16.430713 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Dec 16 12:14:16.430799 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:14:16.430882 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Dec 16 12:14:16.430970 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Dec 16 12:14:16.431060 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:14:16.431154 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Dec 16 12:14:16.431247 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Dec 16 12:14:16.431327 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:14:16.431410 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 16 12:14:16.431494 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Dec 16 12:14:16.431576 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Dec 16 12:14:16.431664 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:14:16.431750 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 16 12:14:16.431833 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Dec 16 12:14:16.431917 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Dec 16 12:14:16.432002 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:14:16.432130 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 16 12:14:16.432218 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Dec 16 12:14:16.432306 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Dec 16 12:14:16.432387 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:14:16.432472 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 16 12:14:16.432558 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Dec 16 12:14:16.432640 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Dec 16 12:14:16.432724 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:14:16.432810 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 16 12:14:16.432911 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Dec 16 12:14:16.432990 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Dec 16 12:14:16.433078 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:14:16.433165 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 16 12:14:16.433245 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Dec 16 12:14:16.433323 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Dec 16 12:14:16.433413 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:14:16.433494 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 16 12:14:16.433587 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Dec 16 12:14:16.433674 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Dec 16 12:14:16.433759 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:14:16.433841 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 16 12:14:16.433921 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Dec 16 12:14:16.434009 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Dec 16 12:14:16.434106 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:14:16.434203 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 16 12:14:16.434294 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Dec 16 12:14:16.434373 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Dec 16 12:14:16.434453 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:14:16.434535 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 16 12:14:16.434614 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Dec 16 12:14:16.434715 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Dec 16 12:14:16.434794 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:14:16.434888 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 16 12:14:16.434970 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Dec 16 12:14:16.435049 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Dec 16 12:14:16.435149 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Dec 16 12:14:16.435231 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 16 12:14:16.435311 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Dec 16 12:14:16.435395 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Dec 16 12:14:16.435481 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:14:16.435564 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 16 12:14:16.435648 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Dec 16 12:14:16.435728 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Dec 16 12:14:16.435806 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:14:16.435888 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 16 12:14:16.435970 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Dec 16 12:14:16.436049 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Dec 16 12:14:16.436145 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:14:16.436237 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 16 12:14:16.436320 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Dec 16 12:14:16.436407 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Dec 16 12:14:16.436486 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:14:16.436568 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 12:14:16.436641 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 16 12:14:16.436714 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 12:14:16.436798 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 12:14:16.436872 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:14:16.436953 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 12:14:16.437028 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:14:16.437128 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 12:14:16.437204 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:14:16.437285 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 12:14:16.437359 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:14:16.437455 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 12:14:16.437530 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:14:16.437623 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 12:14:16.437698 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:14:16.437778 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 12:14:16.437854 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:14:16.437940 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 12:14:16.438014 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:14:16.438115 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 12:14:16.438191 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:14:16.438275 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Dec 16 12:14:16.438348 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:14:16.438427 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Dec 16 12:14:16.438501 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:14:16.438579 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Dec 16 12:14:16.438653 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:14:16.438732 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Dec 16 12:14:16.438812 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:14:16.438894 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Dec 16 12:14:16.438967 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:14:16.439049 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Dec 16 12:14:16.439135 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:14:16.439226 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Dec 16 12:14:16.439305 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:14:16.439385 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Dec 16 12:14:16.439459 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:14:16.439541 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Dec 16 12:14:16.439614 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:14:16.439698 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Dec 16 12:14:16.439772 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Dec 16 12:14:16.439854 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:14:16.439939 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Dec 16 12:14:16.440013 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Dec 16 12:14:16.440103 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:14:16.440184 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Dec 16 12:14:16.440258 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Dec 16 12:14:16.440339 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:14:16.440426 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Dec 16 12:14:16.440505 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Dec 16 12:14:16.440584 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:14:16.440665 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Dec 16 12:14:16.440741 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Dec 16 12:14:16.440820 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:14:16.440902 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Dec 16 12:14:16.440981 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Dec 16 12:14:16.441063 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:14:16.441167 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Dec 16 12:14:16.441258 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Dec 16 12:14:16.441331 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:14:16.441410 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Dec 16 12:14:16.441488 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Dec 16 12:14:16.441561 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:14:16.441667 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 16 12:14:16.441748 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Dec 16 12:14:16.441821 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:14:16.441912 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Dec 16 12:14:16.441988 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Dec 16 12:14:16.442089 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:14:16.442189 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Dec 16 12:14:16.442270 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Dec 16 12:14:16.442345 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Dec 16 12:14:16.442424 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Dec 16 12:14:16.442498 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Dec 16 12:14:16.442571 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:14:16.442654 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Dec 16 12:14:16.442729 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Dec 16 12:14:16.442802 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:14:16.442888 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Dec 16 12:14:16.442963 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Dec 16 12:14:16.443036 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:14:16.443142 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Dec 16 12:14:16.443224 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Dec 16 12:14:16.443297 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:14:16.443308 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 12:14:16.443316 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 12:14:16.443324 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 12:14:16.443334 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 12:14:16.443342 kernel: iommu: Default domain type: Translated Dec 16 12:14:16.443350 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 12:14:16.443359 kernel: efivars: Registered efivars operations Dec 16 12:14:16.443367 kernel: vgaarb: loaded Dec 16 12:14:16.443375 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 12:14:16.443383 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:14:16.443392 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:14:16.443401 kernel: pnp: PnP ACPI 
init Dec 16 12:14:16.443496 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 12:14:16.443508 kernel: pnp: PnP ACPI: found 1 devices Dec 16 12:14:16.443516 kernel: NET: Registered PF_INET protocol family Dec 16 12:14:16.443524 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:14:16.443532 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 16 12:14:16.443542 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:14:16.443550 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 12:14:16.443559 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 16 12:14:16.443566 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 16 12:14:16.443574 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:14:16.443583 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:14:16.443591 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:14:16.443679 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 12:14:16.443690 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:14:16.443698 kernel: kvm [1]: HYP mode not available Dec 16 12:14:16.443707 kernel: Initialise system trusted keyrings Dec 16 12:14:16.443715 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 12:14:16.443722 kernel: Key type asymmetric registered Dec 16 12:14:16.443730 kernel: Asymmetric key parser 'x509' registered Dec 16 12:14:16.443740 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 12:14:16.443748 kernel: io scheduler mq-deadline registered Dec 16 12:14:16.443756 kernel: io scheduler kyber registered Dec 16 12:14:16.443764 kernel: io scheduler bfq registered Dec 16 12:14:16.443772 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 
12:14:16.443853 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Dec 16 12:14:16.443933 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Dec 16 12:14:16.444014 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.444113 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Dec 16 12:14:16.444202 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Dec 16 12:14:16.444281 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.444362 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Dec 16 12:14:16.444441 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Dec 16 12:14:16.444521 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.444603 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Dec 16 12:14:16.444681 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Dec 16 12:14:16.444759 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.444839 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Dec 16 12:14:16.444917 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Dec 16 12:14:16.445002 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.445105 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Dec 16 12:14:16.445203 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Dec 16 12:14:16.445286 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.445373 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Dec 16 12:14:16.445461 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Dec 16 12:14:16.445545 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.445646 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Dec 16 12:14:16.445727 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Dec 16 12:14:16.445809 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.445820 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 12:14:16.445897 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Dec 16 12:14:16.445977 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Dec 16 12:14:16.446083 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.446192 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Dec 16 12:14:16.446274 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Dec 16 12:14:16.446362 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.446448 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Dec 16 12:14:16.446529 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Dec 16 12:14:16.446609 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.446695 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Dec 16 12:14:16.446780 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Dec 16 12:14:16.446859 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.446947 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Dec 16 12:14:16.447029 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Dec 16 12:14:16.447135 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.447239 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Dec 16 12:14:16.447342 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Dec 16 12:14:16.447423 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.447508 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Dec 16 12:14:16.447587 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Dec 16 12:14:16.447666 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.447750 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Dec 16 12:14:16.447829 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Dec 16 12:14:16.447909 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.447920 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 12:14:16.447997 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Dec 16 12:14:16.448090 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Dec 16 12:14:16.448190 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.448276 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Dec 16 12:14:16.448355 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Dec 16 12:14:16.448442 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.448524 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Dec 16 12:14:16.448605 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Dec 16 12:14:16.448688 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.448784 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Dec 16 12:14:16.448873 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Dec 16 12:14:16.448957 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.449046 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Dec 16 12:14:16.449153 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Dec 16 12:14:16.449242 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.449329 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Dec 16 12:14:16.449414 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Dec 16 12:14:16.449500 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.449594 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Dec 16 12:14:16.449675 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Dec 16 12:14:16.449756 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.449847 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Dec 16 12:14:16.449928 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Dec 16 12:14:16.450006 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.450017 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 12:14:16.450110 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Dec 16 12:14:16.450201 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Dec 16 12:14:16.450286 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.450380 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Dec 16 12:14:16.450461 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Dec 16 12:14:16.450545 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.450630 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Dec 16 12:14:16.450714 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Dec 16 12:14:16.450794 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.450892 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Dec 16 12:14:16.450973 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Dec 16 12:14:16.451052 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.451164 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Dec 16 12:14:16.451258 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Dec 16 12:14:16.451351 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.451445 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Dec 16 12:14:16.451527 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Dec 16 12:14:16.451607 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.451693 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Dec 16 12:14:16.451772 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Dec 16 12:14:16.451855 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.451942 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Dec 16 12:14:16.452032 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Dec 16 12:14:16.452130 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.452236 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Dec 16 12:14:16.452322 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Dec 16 12:14:16.452402 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:14:16.452413 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 12:14:16.452428 kernel: ACPI: button: Power Button [PWRB] Dec 16 12:14:16.452515 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Dec 16 12:14:16.452601 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 12:14:16.452612 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:14:16.452620 kernel: thunder_xcv, ver 1.0 Dec 16 12:14:16.452628 kernel: thunder_bgx, ver 1.0 Dec 16 12:14:16.452636 kernel: nicpf, ver 1.0 Dec 16 12:14:16.452646 kernel: nicvf, ver 1.0 Dec 16 12:14:16.452745 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 12:14:16.452824 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T12:14:15 UTC (1765887255) Dec 16 12:14:16.452835 kernel: hid: raw HID events driver (C) Jiri 
Kosina Dec 16 12:14:16.452843 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 12:14:16.452851 kernel: watchdog: NMI not fully supported Dec 16 12:14:16.452861 kernel: watchdog: Hard watchdog permanently disabled Dec 16 12:14:16.452869 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:14:16.452877 kernel: Segment Routing with IPv6 Dec 16 12:14:16.452889 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 12:14:16.452897 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:14:16.452905 kernel: Key type dns_resolver registered Dec 16 12:14:16.452913 kernel: registered taskstats version 1 Dec 16 12:14:16.452921 kernel: Loading compiled-in X.509 certificates Dec 16 12:14:16.452933 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 545838337a91b65b763486e536766b3eec3ef99d' Dec 16 12:14:16.452941 kernel: Demotion targets for Node 0: null Dec 16 12:14:16.452949 kernel: Key type .fscrypt registered Dec 16 12:14:16.452957 kernel: Key type fscrypt-provisioning registered Dec 16 12:14:16.452964 kernel: ima: No TPM chip found, activating TPM-bypass! 
Dec 16 12:14:16.452973 kernel: ima: Allocated hash algorithm: sha1
Dec 16 12:14:16.452981 kernel: ima: No architecture policies found
Dec 16 12:14:16.452991 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Dec 16 12:14:16.452999 kernel: clk: Disabling unused clocks
Dec 16 12:14:16.453006 kernel: PM: genpd: Disabling unused power domains
Dec 16 12:14:16.453015 kernel: Freeing unused kernel memory: 12480K
Dec 16 12:14:16.453023 kernel: Run /init as init process
Dec 16 12:14:16.453031 kernel: with arguments:
Dec 16 12:14:16.453038 kernel: /init
Dec 16 12:14:16.453047 kernel: with environment:
Dec 16 12:14:16.453055 kernel: HOME=/
Dec 16 12:14:16.453063 kernel: TERM=linux
Dec 16 12:14:16.453084 kernel: ACPI: bus type USB registered
Dec 16 12:14:16.453097 kernel: usbcore: registered new interface driver usbfs
Dec 16 12:14:16.453105 kernel: usbcore: registered new interface driver hub
Dec 16 12:14:16.453114 kernel: usbcore: registered new device driver usb
Dec 16 12:14:16.453219 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 12:14:16.453308 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 16 12:14:16.453399 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 16 12:14:16.453493 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 12:14:16.453591 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 16 12:14:16.453682 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 16 12:14:16.453795 kernel: hub 1-0:1.0: USB hub found
Dec 16 12:14:16.453905 kernel: hub 1-0:1.0: 4 ports detected
Dec 16 12:14:16.454015 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 16 12:14:16.454157 kernel: hub 2-0:1.0: USB hub found
Dec 16 12:14:16.454272 kernel: hub 2-0:1.0: 4 ports detected
Dec 16 12:14:16.454380 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Dec 16 12:14:16.454471 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Dec 16 12:14:16.454482 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 16 12:14:16.454491 kernel: GPT:25804799 != 104857599
Dec 16 12:14:16.454499 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 16 12:14:16.454508 kernel: GPT:25804799 != 104857599
Dec 16 12:14:16.454516 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 16 12:14:16.454529 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 16 12:14:16.454537 kernel: SCSI subsystem initialized
Dec 16 12:14:16.454545 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 16 12:14:16.454554 kernel: device-mapper: uevent: version 1.0.3
Dec 16 12:14:16.454564 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 16 12:14:16.454573 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 16 12:14:16.454581 kernel: raid6: neonx8 gen() 15474 MB/s
Dec 16 12:14:16.454591 kernel: raid6: neonx4 gen() 15692 MB/s
Dec 16 12:14:16.454599 kernel: raid6: neonx2 gen() 13296 MB/s
Dec 16 12:14:16.454607 kernel: raid6: neonx1 gen() 10548 MB/s
Dec 16 12:14:16.454621 kernel: raid6: int64x8 gen() 6852 MB/s
Dec 16 12:14:16.454629 kernel: raid6: int64x4 gen() 7349 MB/s
Dec 16 12:14:16.454638 kernel: raid6: int64x2 gen() 6118 MB/s
Dec 16 12:14:16.454646 kernel: raid6: int64x1 gen() 5069 MB/s
Dec 16 12:14:16.454655 kernel: raid6: using algorithm neonx4 gen() 15692 MB/s
Dec 16 12:14:16.454664 kernel: raid6: .... xor() 12320 MB/s, rmw enabled
Dec 16 12:14:16.454672 kernel: raid6: using neon recovery algorithm
Dec 16 12:14:16.454680 kernel: xor: measuring software checksum speed
Dec 16 12:14:16.454690 kernel: 8regs : 21573 MB/sec
Dec 16 12:14:16.454699 kernel: 32regs : 20010 MB/sec
Dec 16 12:14:16.454708 kernel: arm64_neon : 28167 MB/sec
Dec 16 12:14:16.454719 kernel: xor: using function: arm64_neon (28167 MB/sec)
Dec 16 12:14:16.454831 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 16 12:14:16.454844 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 16 12:14:16.454853 kernel: BTRFS: device fsid d00a2bc5-1c68-4957-aa37-d070193fcf05 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (274)
Dec 16 12:14:16.454862 kernel: BTRFS info (device dm-0): first mount of filesystem d00a2bc5-1c68-4957-aa37-d070193fcf05
Dec 16 12:14:16.454870 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 16 12:14:16.454881 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 16 12:14:16.454889 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 16 12:14:16.454897 kernel: loop: module loaded
Dec 16 12:14:16.454906 kernel: loop0: detected capacity change from 0 to 91832
Dec 16 12:14:16.454914 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 16 12:14:16.454923 systemd[1]: Successfully made /usr/ read-only.
Dec 16 12:14:16.454936 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:14:16.454945 systemd[1]: Detected virtualization kvm.
Dec 16 12:14:16.455049 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Dec 16 12:14:16.455066 systemd[1]: Detected architecture arm64.
Dec 16 12:14:16.455094 systemd[1]: Running in initrd.
Dec 16 12:14:16.455103 systemd[1]: No hostname configured, using default hostname.
Dec 16 12:14:16.455116 systemd[1]: Hostname set to .
Dec 16 12:14:16.455127 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:14:16.455137 systemd[1]: Queued start job for default target initrd.target.
Dec 16 12:14:16.455147 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Dec 16 12:14:16.455156 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:14:16.455168 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:14:16.455186 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 16 12:14:16.455196 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:14:16.455205 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 16 12:14:16.455214 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 16 12:14:16.455223 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:14:16.455232 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:14:16.455242 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 12:14:16.455250 systemd[1]: Reached target paths.target - Path Units.
Dec 16 12:14:16.455259 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:14:16.455270 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:14:16.455279 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 12:14:16.455287 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 12:14:16.455296 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 12:14:16.455309 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:14:16.455317 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 16 12:14:16.455326 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 16 12:14:16.455335 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:14:16.455343 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:14:16.455352 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:14:16.455361 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 12:14:16.455372 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 16 12:14:16.455380 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 16 12:14:16.455389 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:14:16.455398 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 16 12:14:16.455407 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 16 12:14:16.455416 systemd[1]: Starting systemd-fsck-usr.service...
Dec 16 12:14:16.455426 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:14:16.455435 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:14:16.455444 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:14:16.455453 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 16 12:14:16.455464 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:14:16.455473 systemd[1]: Finished systemd-fsck-usr.service.
Dec 16 12:14:16.455482 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 12:14:16.455493 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 16 12:14:16.455530 systemd-journald[417]: Collecting audit messages is enabled.
Dec 16 12:14:16.455552 kernel: Bridge firewalling registered
Dec 16 12:14:16.455561 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:14:16.455570 kernel: audit: type=1130 audit(1765887256.385:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.455579 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:14:16.455592 kernel: audit: type=1130 audit(1765887256.389:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.455604 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:14:16.455613 kernel: audit: type=1130 audit(1765887256.393:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.455622 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 16 12:14:16.455631 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:14:16.455640 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:14:16.455650 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:14:16.455660 kernel: audit: type=1130 audit(1765887256.421:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.455669 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:14:16.455678 kernel: audit: type=1130 audit(1765887256.430:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.455687 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:14:16.455695 kernel: audit: type=1334 audit(1765887256.430:7): prog-id=6 op=LOAD
Dec 16 12:14:16.455704 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:14:16.455714 kernel: audit: type=1130 audit(1765887256.438:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.455723 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 16 12:14:16.455732 systemd-journald[417]: Journal started
Dec 16 12:14:16.455751 systemd-journald[417]: Runtime Journal (/run/log/journal/595290ea0e29448aa7105217e31b2dd1) is 8M, max 319.5M, 311.5M free.
Dec 16 12:14:16.385000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.389000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.393000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.421000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.430000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.430000 audit: BPF prog-id=6 op=LOAD
Dec 16 12:14:16.438000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.382209 systemd-modules-load[418]: Inserted module 'br_netfilter'
Dec 16 12:14:16.457459 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:14:16.458000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.461809 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 12:14:16.463691 kernel: audit: type=1130 audit(1765887256.458:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.474057 dracut-cmdline[447]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749
Dec 16 12:14:16.483817 systemd-resolved[444]: Positive Trust Anchors:
Dec 16 12:14:16.483826 systemd-tmpfiles[458]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 16 12:14:16.483836 systemd-resolved[444]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 12:14:16.483840 systemd-resolved[444]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Dec 16 12:14:16.483871 systemd-resolved[444]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 12:14:16.495000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.488842 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:14:16.499103 kernel: audit: type=1130 audit(1765887256.495:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.511304 systemd-resolved[444]: Defaulting to hostname 'linux'.
Dec 16 12:14:16.512252 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 12:14:16.513000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.513497 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:14:16.566112 kernel: Loading iSCSI transport class v2.0-870.
Dec 16 12:14:16.577100 kernel: iscsi: registered transport (tcp)
Dec 16 12:14:16.595111 kernel: iscsi: registered transport (qla4xxx)
Dec 16 12:14:16.595138 kernel: QLogic iSCSI HBA Driver
Dec 16 12:14:16.617689 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:14:16.646722 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:14:16.647000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.648750 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:14:16.693109 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:14:16.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.695207 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 16 12:14:16.696579 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 16 12:14:16.734339 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:14:16.735000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.735000 audit: BPF prog-id=7 op=LOAD
Dec 16 12:14:16.735000 audit: BPF prog-id=8 op=LOAD
Dec 16 12:14:16.736589 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:14:16.763695 systemd-udevd[697]: Using default interface naming scheme 'v257'.
Dec 16 12:14:16.771633 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:14:16.772000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.776217 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 16 12:14:16.799796 dracut-pre-trigger[768]: rd.md=0: removing MD RAID activation
Dec 16 12:14:16.801029 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:14:16.802000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.802000 audit: BPF prog-id=9 op=LOAD
Dec 16 12:14:16.804051 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 12:14:16.826287 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 12:14:16.827000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.828103 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:14:16.852096 systemd-networkd[810]: lo: Link UP
Dec 16 12:14:16.852104 systemd-networkd[810]: lo: Gained carrier
Dec 16 12:14:16.853205 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 12:14:16.854000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.854264 systemd[1]: Reached target network.target - Network.
Dec 16 12:14:16.917510 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:14:16.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:16.920660 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 16 12:14:16.973230 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Dec 16 12:14:16.991680 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Dec 16 12:14:17.000811 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Dec 16 12:14:17.002681 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 16 12:14:17.022068 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Dec 16 12:14:17.022129 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 16 12:14:17.022723 disk-uuid[874]: Primary Header is updated.
Dec 16 12:14:17.022723 disk-uuid[874]: Secondary Entries is updated.
Dec 16 12:14:17.022723 disk-uuid[874]: Secondary Header is updated.
Dec 16 12:14:17.023613 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 16 12:14:17.031103 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Dec 16 12:14:17.064782 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:14:17.064904 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:14:17.066000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:17.066698 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:14:17.067619 systemd-networkd[810]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:14:17.067623 systemd-networkd[810]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:14:17.068732 systemd-networkd[810]: eth0: Link UP
Dec 16 12:14:17.068890 systemd-networkd[810]: eth0: Gained carrier
Dec 16 12:14:17.068899 systemd-networkd[810]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:14:17.069381 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:14:17.086205 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Dec 16 12:14:17.086410 kernel: usbcore: registered new interface driver usbhid
Dec 16 12:14:17.087252 kernel: usbhid: USB HID core driver
Dec 16 12:14:17.107207 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:14:17.107000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:17.145496 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 16 12:14:17.146000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:17.146816 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 12:14:17.148045 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:14:17.149996 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:14:17.152588 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 16 12:14:17.179224 systemd-networkd[810]: eth0: DHCPv4 address 10.0.21.180/25, gateway 10.0.21.129 acquired from 10.0.21.129
Dec 16 12:14:17.184264 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 12:14:17.185000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.052222 disk-uuid[875]: Warning: The kernel is still using the old partition table.
Dec 16 12:14:18.052222 disk-uuid[875]: The new table will be used at the next reboot or after you
Dec 16 12:14:18.052222 disk-uuid[875]: run partprobe(8) or kpartx(8)
Dec 16 12:14:18.052222 disk-uuid[875]: The operation has completed successfully.
Dec 16 12:14:18.062459 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 16 12:14:18.063481 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 16 12:14:18.064000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.064000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.065607 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 16 12:14:18.104094 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (908)
Dec 16 12:14:18.106461 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:14:18.106486 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 16 12:14:18.111118 kernel: BTRFS info (device vda6): turning on async discard
Dec 16 12:14:18.111180 kernel: BTRFS info (device vda6): enabling free space tree
Dec 16 12:14:18.117285 kernel: BTRFS info (device vda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:14:18.117390 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 16 12:14:18.118000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.119116 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 16 12:14:18.242883 ignition[927]: Ignition 2.24.0
Dec 16 12:14:18.242897 ignition[927]: Stage: fetch-offline
Dec 16 12:14:18.242939 ignition[927]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:14:18.242953 ignition[927]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:14:18.244953 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 12:14:18.247000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.243333 ignition[927]: parsed url from cmdline: ""
Dec 16 12:14:18.248128 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 16 12:14:18.243338 ignition[927]: no config URL provided
Dec 16 12:14:18.243344 ignition[927]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 12:14:18.243359 ignition[927]: no config at "/usr/lib/ignition/user.ign"
Dec 16 12:14:18.243364 ignition[927]: failed to fetch config: resource requires networking
Dec 16 12:14:18.243556 ignition[927]: Ignition finished successfully
Dec 16 12:14:18.282520 ignition[940]: Ignition 2.24.0
Dec 16 12:14:18.282541 ignition[940]: Stage: fetch
Dec 16 12:14:18.282691 ignition[940]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:14:18.282699 ignition[940]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:14:18.282782 ignition[940]: parsed url from cmdline: ""
Dec 16 12:14:18.282785 ignition[940]: no config URL provided
Dec 16 12:14:18.282789 ignition[940]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 12:14:18.282794 ignition[940]: no config at "/usr/lib/ignition/user.ign"
Dec 16 12:14:18.283199 ignition[940]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Dec 16 12:14:18.283215 ignition[940]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Dec 16 12:14:18.283454 ignition[940]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Dec 16 12:14:18.314398 systemd-networkd[810]: eth0: Gained IPv6LL
Dec 16 12:14:18.608729 ignition[940]: GET result: OK
Dec 16 12:14:18.608921 ignition[940]: parsing config with SHA512: 9a193c3a0449835aa741f5ad67b870c745a337cbfc48247e390078c0fc1c49bbc5525301b0af9fa4d949e190322c6bfae2e0d24d43cddf53ea00cfbced1e9553
Dec 16 12:14:18.613577 unknown[940]: fetched base config from "system"
Dec 16 12:14:18.613588 unknown[940]: fetched base config from "system"
Dec 16 12:14:18.613907 ignition[940]: fetch: fetch complete
Dec 16 12:14:18.613593 unknown[940]: fetched user config from "openstack"
Dec 16 12:14:18.620200 kernel: kauditd_printk_skb: 20 callbacks suppressed
Dec 16 12:14:18.620223 kernel: audit: type=1130 audit(1765887258.617:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.617000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:18.613911 ignition[940]: fetch: fetch passed
Dec 16 12:14:18.616252 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 16 12:14:18.613950 ignition[940]: Ignition finished successfully
Dec 16 12:14:18.618113 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 16 12:14:18.642347 ignition[948]: Ignition 2.24.0
Dec 16 12:14:18.642365 ignition[948]: Stage: kargs
Dec 16 12:14:18.642506 ignition[948]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:14:18.642515 ignition[948]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:14:18.645161 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 16 12:14:18.646000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:18.649103 kernel: audit: type=1130 audit(1765887258.646:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:18.643263 ignition[948]: kargs: kargs passed Dec 16 12:14:18.647650 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 12:14:18.643316 ignition[948]: Ignition finished successfully Dec 16 12:14:18.670253 ignition[955]: Ignition 2.24.0 Dec 16 12:14:18.670273 ignition[955]: Stage: disks Dec 16 12:14:18.670431 ignition[955]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:14:18.670439 ignition[955]: no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:14:18.671201 ignition[955]: disks: disks passed Dec 16 12:14:18.673000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:18.673002 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 12:14:18.677716 kernel: audit: type=1130 audit(1765887258.673:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:18.671252 ignition[955]: Ignition finished successfully Dec 16 12:14:18.674180 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 12:14:18.676987 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 12:14:18.678633 systemd[1]: Reached target local-fs.target - Local File Systems. 
Dec 16 12:14:18.679823 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:14:18.681308 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:14:18.683854 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 12:14:18.725221 systemd-fsck[964]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Dec 16 12:14:18.727891 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 12:14:18.732154 kernel: audit: type=1130 audit(1765887258.729:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:18.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:18.729983 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 12:14:18.834154 kernel: EXT4-fs (vda9): mounted filesystem 0e69f709-36a9-4e15-b0c9-c7e150185653 r/w with ordered data mode. Quota mode: none. Dec 16 12:14:18.835122 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 12:14:18.836238 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 12:14:18.839157 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:14:18.840833 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 12:14:18.841766 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met. Dec 16 12:14:18.842600 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent... Dec 16 12:14:18.843542 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). 
Dec 16 12:14:18.843577 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:14:18.851226 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 12:14:18.853767 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Dec 16 12:14:18.867791 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (973) Dec 16 12:14:18.867848 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:14:18.867861 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:14:18.875337 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:14:18.875423 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:14:18.876801 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:14:18.904133 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:19.003231 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 12:14:19.007132 kernel: audit: type=1130 audit(1765887259.004:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:19.004000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:19.004953 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 12:14:19.016901 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 12:14:19.022829 systemd[1]: sysroot-oem.mount: Deactivated successfully. 
Dec 16 12:14:19.024340 kernel: BTRFS info (device vda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:14:19.053417 ignition[1074]: INFO : Ignition 2.24.0 Dec 16 12:14:19.053417 ignition[1074]: INFO : Stage: mount Dec 16 12:14:19.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:19.058051 ignition[1074]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:14:19.058051 ignition[1074]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:14:19.058051 ignition[1074]: INFO : mount: mount passed Dec 16 12:14:19.058051 ignition[1074]: INFO : Ignition finished successfully Dec 16 12:14:19.064001 kernel: audit: type=1130 audit(1765887259.055:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:19.064028 kernel: audit: type=1130 audit(1765887259.058:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:19.058000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:19.054436 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 12:14:19.058037 systemd[1]: Finished ignition-mount.service - Ignition (mount). 
Dec 16 12:14:19.935111 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:21.945107 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:25.950154 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:25.954337 coreos-metadata[975]: Dec 16 12:14:25.954 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:14:25.973144 coreos-metadata[975]: Dec 16 12:14:25.973 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:14:27.570217 coreos-metadata[975]: Dec 16 12:14:27.570 INFO Fetch successful Dec 16 12:14:27.571348 coreos-metadata[975]: Dec 16 12:14:27.571 INFO wrote hostname ci-4547-0-0-0-5b424f63c8 to /sysroot/etc/hostname Dec 16 12:14:27.572808 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully. Dec 16 12:14:27.581527 kernel: audit: type=1130 audit(1765887267.575:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:27.581554 kernel: audit: type=1131 audit(1765887267.575:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:27.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:27.575000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:27.572935 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent. 
Dec 16 12:14:27.576678 systemd[1]: Starting ignition-files.service - Ignition (files)... Dec 16 12:14:27.596134 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:14:27.613091 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1092) Dec 16 12:14:27.615925 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47 Dec 16 12:14:27.615968 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:14:27.620312 kernel: BTRFS info (device vda6): turning on async discard Dec 16 12:14:27.620353 kernel: BTRFS info (device vda6): enabling free space tree Dec 16 12:14:27.621718 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:14:27.656395 ignition[1110]: INFO : Ignition 2.24.0 Dec 16 12:14:27.656395 ignition[1110]: INFO : Stage: files Dec 16 12:14:27.658237 ignition[1110]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:14:27.658237 ignition[1110]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:14:27.658237 ignition[1110]: DEBUG : files: compiled without relabeling support, skipping Dec 16 12:14:27.661698 ignition[1110]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 12:14:27.661698 ignition[1110]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 12:14:27.664385 ignition[1110]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 12:14:27.664385 ignition[1110]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 12:14:27.664385 ignition[1110]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 12:14:27.663826 unknown[1110]: wrote ssh authorized keys file for user: core Dec 16 12:14:27.670450 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file 
"/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Dec 16 12:14:27.670450 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.0-linux-arm64.tar.gz: attempt #1 Dec 16 12:14:27.773769 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 12:14:28.253739 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.0-linux-arm64.tar.gz" Dec 16 12:14:28.253739 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:14:28.257151 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:14:28.267580 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:14:28.267580 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file 
"/sysroot/etc/flatcar/update.conf" Dec 16 12:14:28.267580 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 16 12:14:28.267580 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 16 12:14:28.267580 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 16 12:14:28.267580 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.32.4-arm64.raw: attempt #1 Dec 16 12:14:28.372642 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 12:14:28.919772 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.32.4-arm64.raw" Dec 16 12:14:28.919772 ignition[1110]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 12:14:28.923263 ignition[1110]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:14:28.926376 ignition[1110]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:14:28.926376 ignition[1110]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 12:14:28.926376 ignition[1110]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service" Dec 16 12:14:28.929856 ignition[1110]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 12:14:28.929856 ignition[1110]: INFO : 
files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:14:28.929856 ignition[1110]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:14:28.929856 ignition[1110]: INFO : files: files passed Dec 16 12:14:28.929856 ignition[1110]: INFO : Ignition finished successfully Dec 16 12:14:28.938148 kernel: audit: type=1130 audit(1765887268.932:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.932000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.931176 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 12:14:28.936261 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 12:14:28.939214 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 12:14:28.950231 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 12:14:28.951064 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 12:14:28.956599 kernel: audit: type=1130 audit(1765887268.952:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.956626 kernel: audit: type=1131 audit(1765887268.952:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:28.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.952000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.958692 initrd-setup-root-after-ignition[1143]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:14:28.960106 initrd-setup-root-after-ignition[1143]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:14:28.961211 initrd-setup-root-after-ignition[1147]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:14:28.962117 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:14:28.966128 kernel: audit: type=1130 audit(1765887268.963:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.963000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.963407 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 12:14:28.967983 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 12:14:29.001636 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 12:14:29.001759 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Dec 16 12:14:29.010815 kernel: audit: type=1130 audit(1765887269.005:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.010845 kernel: audit: type=1131 audit(1765887269.005:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.005000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.005000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.005367 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 12:14:29.011665 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 12:14:29.013214 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 12:14:29.014192 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 12:14:29.048107 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:14:29.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.050317 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 12:14:29.053703 kernel: audit: type=1130 audit(1765887269.048:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Dec 16 12:14:29.077465 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:14:29.077693 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:14:29.079584 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:14:29.081124 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 12:14:29.082699 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 12:14:29.083000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.082828 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:14:29.087645 kernel: audit: type=1131 audit(1765887269.083:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.086693 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 12:14:29.088464 systemd[1]: Stopped target basic.target - Basic System. Dec 16 12:14:29.089797 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 12:14:29.091097 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:14:29.092670 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Dec 16 12:14:29.094210 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:14:29.095694 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 12:14:29.097090 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:14:29.098733 systemd[1]: Stopped target sysinit.target - System Initialization. 
Dec 16 12:14:29.100238 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 12:14:29.101603 systemd[1]: Stopped target swap.target - Swaps. Dec 16 12:14:29.102762 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 12:14:29.103000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.102898 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:14:29.104710 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:14:29.106249 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:14:29.107746 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 12:14:29.111190 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:14:29.112344 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 12:14:29.113000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.112467 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 12:14:29.114670 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 12:14:29.116000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.114793 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:14:29.117000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Dec 16 12:14:29.116305 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 12:14:29.116406 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 12:14:29.118689 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 12:14:29.120647 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 12:14:29.122036 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 12:14:29.123000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.122165 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:14:29.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.123691 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 16 12:14:29.126000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.123790 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:14:29.125129 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 12:14:29.125229 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:14:29.130320 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 12:14:29.135265 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 12:14:29.136000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? 
terminal=? res=success' Dec 16 12:14:29.136000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.147029 ignition[1167]: INFO : Ignition 2.24.0 Dec 16 12:14:29.147029 ignition[1167]: INFO : Stage: umount Dec 16 12:14:29.147029 ignition[1167]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:14:29.147029 ignition[1167]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack" Dec 16 12:14:29.150392 ignition[1167]: INFO : umount: umount passed Dec 16 12:14:29.150392 ignition[1167]: INFO : Ignition finished successfully Dec 16 12:14:29.149570 systemd[1]: ignition-mount.service: Deactivated successfully. Dec 16 12:14:29.152000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.151125 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 12:14:29.153352 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 12:14:29.155000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.153723 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 12:14:29.157000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.153763 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 12:14:29.159000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:29.155205 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 12:14:29.155251 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 12:14:29.162000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.157918 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 12:14:29.157968 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 12:14:29.159276 systemd[1]: Stopped target network.target - Network. Dec 16 12:14:29.160954 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 12:14:29.161014 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:14:29.162485 systemd[1]: Stopped target paths.target - Path Units. Dec 16 12:14:29.163766 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 12:14:29.168149 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:14:29.169085 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 12:14:29.171031 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 12:14:29.172566 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 12:14:29.172604 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:14:29.179000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.174684 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 12:14:29.181000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:29.174713 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:14:29.176523 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Dec 16 12:14:29.176543 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:14:29.178333 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 12:14:29.178397 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 12:14:29.179857 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 12:14:29.179895 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Dec 16 12:14:29.190000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.181659 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 12:14:29.182893 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 12:14:29.189450 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 12:14:29.189576 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 12:14:29.195000 audit: BPF prog-id=6 op=UNLOAD Dec 16 12:14:29.198948 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 12:14:29.199140 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 12:14:29.201000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:29.203819 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 12:14:29.204819 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 12:14:29.204866 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. 
Dec 16 12:14:29.207474 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 16 12:14:29.208000 audit: BPF prog-id=9 op=UNLOAD
Dec 16 12:14:29.208933 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 16 12:14:29.212000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.208993 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:14:29.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.212236 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 16 12:14:29.215000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.212278 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:14:29.213578 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 16 12:14:29.213616 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:14:29.215767 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:14:29.229617 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 16 12:14:29.229762 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:14:29.231000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.233038 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 16 12:14:29.233125 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:14:29.234682 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 16 12:14:29.234709 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:14:29.237000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.236118 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 16 12:14:29.236172 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:14:29.240000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.238770 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 16 12:14:29.238816 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:14:29.242000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.240843 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 16 12:14:29.240891 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:14:29.254037 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 16 12:14:29.254865 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 16 12:14:29.256000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.254928 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:14:29.258000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.260000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.256695 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 16 12:14:29.256737 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:14:29.262000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.258340 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 16 12:14:29.263000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.258381 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:14:29.260472 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 16 12:14:29.260525 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:14:29.267000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.262499 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:14:29.269000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.269000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.262559 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:14:29.264714 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 16 12:14:29.266152 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 16 12:14:29.267866 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 16 12:14:29.267955 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 16 12:14:29.302973 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 16 12:14:29.303107 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 16 12:14:29.304000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.304803 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 16 12:14:29.305915 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 16 12:14:29.307000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:29.305970 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 16 12:14:29.308450 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 16 12:14:29.333145 systemd[1]: Switching root.
Dec 16 12:14:29.376413 systemd-journald[417]: Journal stopped
Dec 16 12:14:30.674801 systemd-journald[417]: Received SIGTERM from PID 1 (systemd).
Dec 16 12:14:30.674875 kernel: SELinux: policy capability network_peer_controls=1
Dec 16 12:14:30.674893 kernel: SELinux: policy capability open_perms=1
Dec 16 12:14:30.674909 kernel: SELinux: policy capability extended_socket_class=1
Dec 16 12:14:30.674921 kernel: SELinux: policy capability always_check_network=0
Dec 16 12:14:30.674931 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 16 12:14:30.674944 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 16 12:14:30.674959 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 16 12:14:30.674972 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 16 12:14:30.674982 kernel: SELinux: policy capability userspace_initial_context=0
Dec 16 12:14:30.674992 systemd[1]: Successfully loaded SELinux policy in 66.420ms.
Dec 16 12:14:30.675010 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.976ms.
Dec 16 12:14:30.675022 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:14:30.675035 systemd[1]: Detected virtualization kvm.
Dec 16 12:14:30.675046 systemd[1]: Detected architecture arm64.
Dec 16 12:14:30.675056 systemd[1]: Detected first boot.
Dec 16 12:14:30.675070 systemd[1]: Hostname set to .
Dec 16 12:14:30.675097 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:14:30.675108 zram_generator::config[1212]: No configuration found.
Dec 16 12:14:30.675124 kernel: NET: Registered PF_VSOCK protocol family
Dec 16 12:14:30.675137 systemd[1]: Populated /etc with preset unit settings.
Dec 16 12:14:30.675148 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 16 12:14:30.675158 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 16 12:14:30.675169 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 16 12:14:30.675180 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 16 12:14:30.675191 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 16 12:14:30.675203 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 16 12:14:30.675214 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 16 12:14:30.675225 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 16 12:14:30.675236 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 16 12:14:30.675247 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 16 12:14:30.675258 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 16 12:14:30.675268 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:14:30.675281 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:14:30.675292 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 16 12:14:30.675305 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 16 12:14:30.675318 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 16 12:14:30.675328 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:14:30.675339 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 16 12:14:30.675352 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:14:30.675364 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:14:30.675375 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 16 12:14:30.675386 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 16 12:14:30.675397 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 16 12:14:30.675407 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 12:14:30.675419 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:14:30.675430 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:14:30.675441 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Dec 16 12:14:30.675454 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:14:30.675465 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:14:30.675476 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 16 12:14:30.675487 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 16 12:14:30.675500 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 16 12:14:30.675510 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:14:30.675521 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Dec 16 12:14:30.675532 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:14:30.675543 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Dec 16 12:14:30.675554 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Dec 16 12:14:30.675564 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:14:30.675577 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:14:30.675587 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 16 12:14:30.675598 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 16 12:14:30.675609 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 16 12:14:30.675620 systemd[1]: Mounting media.mount - External Media Directory...
Dec 16 12:14:30.675630 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 16 12:14:30.675641 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 16 12:14:30.675654 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 16 12:14:30.675665 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 16 12:14:30.675678 systemd[1]: Reached target machines.target - Containers.
Dec 16 12:14:30.675688 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 16 12:14:30.675699 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:14:30.675710 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:14:30.675721 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 16 12:14:30.675733 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:14:30.675745 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:14:30.675758 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:14:30.675770 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 16 12:14:30.675784 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:14:30.675795 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 16 12:14:30.675806 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 16 12:14:30.675817 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 16 12:14:30.675828 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 16 12:14:30.675839 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 16 12:14:30.675851 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:14:30.675865 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:14:30.675875 kernel: ACPI: bus type drm_connector registered
Dec 16 12:14:30.675886 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:14:30.675897 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:14:30.675907 kernel: fuse: init (API version 7.41)
Dec 16 12:14:30.675918 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 16 12:14:30.675930 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 16 12:14:30.675940 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:14:30.675951 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 16 12:14:30.675962 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 16 12:14:30.675973 systemd[1]: Mounted media.mount - External Media Directory.
Dec 16 12:14:30.675984 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 16 12:14:30.675994 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 16 12:14:30.676008 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 16 12:14:30.676019 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:14:30.676052 systemd-journald[1286]: Collecting audit messages is enabled.
Dec 16 12:14:30.676085 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 16 12:14:30.676098 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 16 12:14:30.676120 systemd-journald[1286]: Journal started
Dec 16 12:14:30.676142 systemd-journald[1286]: Runtime Journal (/run/log/journal/595290ea0e29448aa7105217e31b2dd1) is 8M, max 319.5M, 311.5M free.
Dec 16 12:14:30.528000 audit[1]: EVENT_LISTENER pid=1 uid=0 auid=4294967295 tty=(none) ses=4294967295 subj=system_u:system_r:kernel_t:s0 comm="systemd" exe="/usr/lib/systemd/systemd" nl-mcgrp=1 op=connect res=1
Dec 16 12:14:30.614000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.616000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.618000 audit: BPF prog-id=14 op=UNLOAD
Dec 16 12:14:30.618000 audit: BPF prog-id=13 op=UNLOAD
Dec 16 12:14:30.625000 audit: BPF prog-id=15 op=LOAD
Dec 16 12:14:30.625000 audit: BPF prog-id=16 op=LOAD
Dec 16 12:14:30.625000 audit: BPF prog-id=17 op=LOAD
Dec 16 12:14:30.672000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Dec 16 12:14:30.672000 audit[1286]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=3 a1=ffffd2c86f40 a2=4000 a3=0 items=0 ppid=1 pid=1286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:14:30.672000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Dec 16 12:14:30.674000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.450254 systemd[1]: Queued start job for default target multi-user.target.
Dec 16 12:14:30.466594 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Dec 16 12:14:30.467108 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 16 12:14:30.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.677000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.679283 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:14:30.679000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.681196 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 16 12:14:30.682000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.682380 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:14:30.682555 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:14:30.683000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.683000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.683744 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:14:30.685126 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:14:30.685000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.685000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.686235 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:14:30.686402 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:14:30.687000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.687000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.687784 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 16 12:14:30.687958 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 16 12:14:30.688000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.688000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.689269 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:14:30.689426 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:14:30.690000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.690000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.690674 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:14:30.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.692092 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:14:30.692000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.694132 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 16 12:14:30.695000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.695736 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 16 12:14:30.696000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.707776 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:14:30.709135 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Dec 16 12:14:30.711230 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 16 12:14:30.713143 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 16 12:14:30.714027 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 16 12:14:30.714056 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 12:14:30.715767 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 16 12:14:30.718183 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:14:30.718310 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:14:30.726040 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 16 12:14:30.727904 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 16 12:14:30.728943 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:14:30.732257 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 16 12:14:30.733311 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:14:30.734421 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:14:30.737299 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 16 12:14:30.739294 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 12:14:30.740521 systemd-journald[1286]: Time spent on flushing to /var/log/journal/595290ea0e29448aa7105217e31b2dd1 is 30.378ms for 1814 entries.
Dec 16 12:14:30.740521 systemd-journald[1286]: System Journal (/var/log/journal/595290ea0e29448aa7105217e31b2dd1) is 8M, max 588.1M, 580.1M free.
Dec 16 12:14:30.781215 systemd-journald[1286]: Received client request to flush runtime journal.
Dec 16 12:14:30.781256 kernel: loop1: detected capacity change from 0 to 1648
Dec 16 12:14:30.754000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.755000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.767000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.743709 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 16 12:14:30.745010 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 16 12:14:30.753014 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 16 12:14:30.754521 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:14:30.756340 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 16 12:14:30.762271 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 16 12:14:30.766257 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:14:30.779121 systemd-tmpfiles[1334]: ACLs are not supported, ignoring.
Dec 16 12:14:30.779132 systemd-tmpfiles[1334]: ACLs are not supported, ignoring.
Dec 16 12:14:30.783172 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 16 12:14:30.785000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.785813 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:14:30.786000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.790321 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 16 12:14:30.793122 kernel: loop2: detected capacity change from 0 to 45344
Dec 16 12:14:30.806714 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 16 12:14:30.807000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.841039 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 16 12:14:30.841000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.843000 audit: BPF prog-id=18 op=LOAD
Dec 16 12:14:30.843000 audit: BPF prog-id=19 op=LOAD
Dec 16 12:14:30.844000 audit: BPF prog-id=20 op=LOAD
Dec 16 12:14:30.847097 kernel: loop3: detected capacity change from 0 to 207008
Dec 16 12:14:30.846000 audit: BPF prog-id=21 op=LOAD
Dec 16 12:14:30.844905 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Dec 16 12:14:30.847514 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:14:30.852249 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:14:30.854000 audit: BPF prog-id=22 op=LOAD
Dec 16 12:14:30.854000 audit: BPF prog-id=23 op=LOAD
Dec 16 12:14:30.864000 audit: BPF prog-id=24 op=LOAD
Dec 16 12:14:30.865112 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager...
Dec 16 12:14:30.866000 audit: BPF prog-id=25 op=LOAD
Dec 16 12:14:30.866000 audit: BPF prog-id=26 op=LOAD
Dec 16 12:14:30.866000 audit: BPF prog-id=27 op=LOAD
Dec 16 12:14:30.867909 systemd[1]: Starting systemd-userdbd.service - User Database Manager...
Dec 16 12:14:30.876329 systemd-tmpfiles[1355]: ACLs are not supported, ignoring.
Dec 16 12:14:30.876345 systemd-tmpfiles[1355]: ACLs are not supported, ignoring.
Dec 16 12:14:30.886249 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:14:30.887000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.891104 kernel: loop4: detected capacity change from 0 to 100192
Dec 16 12:14:30.912510 systemd-nsresourced[1357]: Not setting up BPF subsystem, as functionality has been disabled at compile time.
Dec 16 12:14:30.912995 systemd[1]: Started systemd-userdbd.service - User Database Manager.
Dec 16 12:14:30.914000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.914553 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager.
Dec 16 12:14:30.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.965113 kernel: loop5: detected capacity change from 0 to 1648
Dec 16 12:14:30.969215 systemd-oomd[1353]: No swap; memory pressure usage will be degraded
Dec 16 12:14:30.969662 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer.
Dec 16 12:14:30.970000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.975100 kernel: loop6: detected capacity change from 0 to 45344
Dec 16 12:14:30.981798 systemd-resolved[1354]: Positive Trust Anchors:
Dec 16 12:14:30.982098 systemd-resolved[1354]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 12:14:30.982106 systemd-resolved[1354]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Dec 16 12:14:30.982139 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 12:14:30.988092 kernel: loop7: detected capacity change from 0 to 207008
Dec 16 12:14:30.991370 systemd-resolved[1354]: Using system hostname 'ci-4547-0-0-0-5b424f63c8'.
Dec 16 12:14:30.992769 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 12:14:30.993000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:30.993837 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:14:31.017151 kernel: loop1: detected capacity change from 0 to 100192
Dec 16 12:14:31.036894 (sd-merge)[1376]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'.
Dec 16 12:14:31.039869 (sd-merge)[1376]: Merged extensions into '/usr'.
Dec 16 12:14:31.044226 systemd[1]: Reload requested from client PID 1333 ('systemd-sysext') (unit systemd-sysext.service)...
Dec 16 12:14:31.044250 systemd[1]: Reloading...
Dec 16 12:14:31.104119 zram_generator::config[1406]: No configuration found.
Dec 16 12:14:31.265786 systemd[1]: etc-machine\x2did.mount: Deactivated successfully.
Dec 16 12:14:31.266009 systemd[1]: Reloading finished in 221 ms.
Dec 16 12:14:31.302473 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database.
Dec 16 12:14:31.303000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.305104 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/.
Dec 16 12:14:31.305000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.329626 systemd[1]: Starting ensure-sysext.service...
Dec 16 12:14:31.331337 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 12:14:31.332000 audit: BPF prog-id=8 op=UNLOAD
Dec 16 12:14:31.332000 audit: BPF prog-id=7 op=UNLOAD
Dec 16 12:14:31.332000 audit: BPF prog-id=28 op=LOAD
Dec 16 12:14:31.332000 audit: BPF prog-id=29 op=LOAD
Dec 16 12:14:31.333579 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:14:31.335000 audit: BPF prog-id=30 op=LOAD
Dec 16 12:14:31.335000 audit: BPF prog-id=18 op=UNLOAD
Dec 16 12:14:31.335000 audit: BPF prog-id=31 op=LOAD
Dec 16 12:14:31.335000 audit: BPF prog-id=32 op=LOAD
Dec 16 12:14:31.335000 audit: BPF prog-id=19 op=UNLOAD
Dec 16 12:14:31.335000 audit: BPF prog-id=20 op=UNLOAD
Dec 16 12:14:31.336000 audit: BPF prog-id=33 op=LOAD
Dec 16 12:14:31.336000 audit: BPF prog-id=15 op=UNLOAD
Dec 16 12:14:31.336000 audit: BPF prog-id=34 op=LOAD
Dec 16 12:14:31.336000 audit: BPF prog-id=35 op=LOAD
Dec 16 12:14:31.336000 audit: BPF prog-id=16 op=UNLOAD
Dec 16 12:14:31.336000 audit: BPF prog-id=17 op=UNLOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=36 op=LOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=22 op=UNLOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=37 op=LOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=38 op=LOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=23 op=UNLOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=24 op=UNLOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=39 op=LOAD
Dec 16 12:14:31.337000 audit: BPF prog-id=21 op=UNLOAD
Dec 16 12:14:31.338000 audit: BPF prog-id=40 op=LOAD
Dec 16 12:14:31.338000 audit: BPF prog-id=25 op=UNLOAD
Dec 16 12:14:31.338000 audit: BPF prog-id=41 op=LOAD
Dec 16 12:14:31.338000 audit: BPF prog-id=42 op=LOAD
Dec 16 12:14:31.338000 audit: BPF prog-id=26 op=UNLOAD
Dec 16 12:14:31.338000 audit: BPF prog-id=27 op=UNLOAD
Dec 16 12:14:31.344875 systemd[1]: Reload requested from client PID 1443 ('systemctl') (unit ensure-sysext.service)...
Dec 16 12:14:31.344893 systemd[1]: Reloading...
Dec 16 12:14:31.346573 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring.
Dec 16 12:14:31.346617 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring.
Dec 16 12:14:31.347647 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring.
Dec 16 12:14:31.348602 systemd-tmpfiles[1444]: ACLs are not supported, ignoring.
Dec 16 12:14:31.348655 systemd-tmpfiles[1444]: ACLs are not supported, ignoring.
Dec 16 12:14:31.356227 systemd-tmpfiles[1444]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 12:14:31.356242 systemd-tmpfiles[1444]: Skipping /boot
Dec 16 12:14:31.357421 systemd-udevd[1445]: Using default interface naming scheme 'v257'.
Dec 16 12:14:31.363136 systemd-tmpfiles[1444]: Detected autofs mount point /boot during canonicalization of boot.
Dec 16 12:14:31.363147 systemd-tmpfiles[1444]: Skipping /boot
Dec 16 12:14:31.400687 zram_generator::config[1477]: No configuration found.
Dec 16 12:14:31.495130 kernel: mousedev: PS/2 mouse device common for all mice
Dec 16 12:14:31.588095 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0
Dec 16 12:14:31.588183 kernel: [drm] features: -virgl +edid -resource_blob -host_visible
Dec 16 12:14:31.588200 kernel: [drm] features: -context_init
Dec 16 12:14:31.588214 kernel: [drm] number of scanouts: 1
Dec 16 12:14:31.588230 kernel: [drm] number of cap sets: 0
Dec 16 12:14:31.588243 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0
Dec 16 12:14:31.595096 kernel: Console: switching to colour frame buffer device 160x50
Dec 16 12:14:31.595192 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device
Dec 16 12:14:31.610741 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped.
Dec 16 12:14:31.611021 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 16 12:14:31.612785 systemd[1]: Reloading finished in 267 ms.
Dec 16 12:14:31.627000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.624132 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:14:31.628118 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:14:31.629000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.632000 audit: BPF prog-id=43 op=LOAD
Dec 16 12:14:31.632000 audit: BPF prog-id=39 op=UNLOAD
Dec 16 12:14:31.633000 audit: BPF prog-id=44 op=LOAD
Dec 16 12:14:31.633000 audit: BPF prog-id=45 op=LOAD
Dec 16 12:14:31.633000 audit: BPF prog-id=28 op=UNLOAD
Dec 16 12:14:31.633000 audit: BPF prog-id=29 op=UNLOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=46 op=LOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=33 op=UNLOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=47 op=LOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=48 op=LOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=34 op=UNLOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=35 op=UNLOAD
Dec 16 12:14:31.634000 audit: BPF prog-id=49 op=LOAD
Dec 16 12:14:31.635000 audit: BPF prog-id=30 op=UNLOAD
Dec 16 12:14:31.635000 audit: BPF prog-id=50 op=LOAD
Dec 16 12:14:31.635000 audit: BPF prog-id=51 op=LOAD
Dec 16 12:14:31.635000 audit: BPF prog-id=31 op=UNLOAD
Dec 16 12:14:31.635000 audit: BPF prog-id=32 op=UNLOAD
Dec 16 12:14:31.636000 audit: BPF prog-id=52 op=LOAD
Dec 16 12:14:31.636000 audit: BPF prog-id=36 op=UNLOAD
Dec 16 12:14:31.636000 audit: BPF prog-id=53 op=LOAD
Dec 16 12:14:31.636000 audit: BPF prog-id=54 op=LOAD
Dec 16 12:14:31.636000 audit: BPF prog-id=37 op=UNLOAD
Dec 16 12:14:31.636000 audit: BPF prog-id=38 op=UNLOAD
Dec 16 12:14:31.637000 audit: BPF prog-id=55 op=LOAD
Dec 16 12:14:31.641000 audit: BPF prog-id=40 op=UNLOAD
Dec 16 12:14:31.641000 audit: BPF prog-id=56 op=LOAD
Dec 16 12:14:31.641000 audit: BPF prog-id=57 op=LOAD
Dec 16 12:14:31.641000 audit: BPF prog-id=41 op=UNLOAD
Dec 16 12:14:31.641000 audit: BPF prog-id=42 op=UNLOAD
Dec 16 12:14:31.665404 systemd[1]: Finished ensure-sysext.service.
Dec 16 12:14:31.666000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.683780 systemd[1]: Starting audit-rules.service - Load Audit Rules...
Dec 16 12:14:31.686780 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs...
Dec 16 12:14:31.687856 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:14:31.698222 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:14:31.700422 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:14:31.702220 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:14:31.706278 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:14:31.708357 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm...
Dec 16 12:14:31.709310 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:14:31.709423 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:14:31.711062 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache...
Dec 16 12:14:31.713283 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM...
Dec 16 12:14:31.714441 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:14:31.716515 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog...
Dec 16 12:14:31.719000 audit: BPF prog-id=58 op=LOAD
Dec 16 12:14:31.720162 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 12:14:31.723166 systemd[1]: Reached target time-set.target - System Time Set.
Dec 16 12:14:31.726129 kernel: pps_core: LinuxPPS API ver. 1 registered
Dec 16 12:14:31.726193 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti
Dec 16 12:14:31.726501 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP...
Dec 16 12:14:31.729054 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:14:31.732767 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:14:31.734102 kernel: PTP clock support registered
Dec 16 12:14:31.734122 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:14:31.735557 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:14:31.735744 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:14:31.735000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.735000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.736000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.736000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.737353 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:14:31.737566 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:14:31.740656 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:14:31.740839 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:14:31.740000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.740000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.742401 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully.
Dec 16 12:14:31.742599 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm.
Dec 16 12:14:31.742000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.742000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.743000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.743000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.743995 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM.
Dec 16 12:14:31.744000 audit[1586]: SYSTEM_BOOT pid=1586 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.745000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.752834 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:14:31.752985 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:14:31.756507 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP.
Dec 16 12:14:31.757000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.769258 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog.
Dec 16 12:14:31.770000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:31.775000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1
Dec 16 12:14:31.775000 audit[1612]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc2360630 a2=420 a3=0 items=0 ppid=1565 pid=1612 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:14:31.775000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573
Dec 16 12:14:31.777264 augenrules[1612]: No rules
Dec 16 12:14:31.777611 systemd[1]: audit-rules.service: Deactivated successfully.
Dec 16 12:14:31.779173 systemd[1]: Finished audit-rules.service - Load Audit Rules.
Dec 16 12:14:31.808162 systemd-networkd[1582]: lo: Link UP
Dec 16 12:14:31.808175 systemd-networkd[1582]: lo: Gained carrier
Dec 16 12:14:31.809311 systemd-networkd[1582]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:14:31.809321 systemd-networkd[1582]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:14:31.809382 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 12:14:31.810257 systemd-networkd[1582]: eth0: Link UP
Dec 16 12:14:31.810441 systemd[1]: Reached target network.target - Network.
Dec 16 12:14:31.810566 systemd-networkd[1582]: eth0: Gained carrier
Dec 16 12:14:31.810582 systemd-networkd[1582]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:14:31.813283 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd...
Dec 16 12:14:31.818132 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured...
Dec 16 12:14:31.819696 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs.
Dec 16 12:14:31.823374 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt).
Dec 16 12:14:31.830190 systemd-networkd[1582]: eth0: DHCPv4 address 10.0.21.180/25, gateway 10.0.21.129 acquired from 10.0.21.129
Dec 16 12:14:31.830977 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:14:31.841892 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd.
Dec 16 12:14:32.295813 ldconfig[1578]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start.
Dec 16 12:14:32.300491 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache.
Dec 16 12:14:32.302838 systemd[1]: Starting systemd-update-done.service - Update is Completed...
Dec 16 12:14:32.321525 systemd[1]: Finished systemd-update-done.service - Update is Completed.
Dec 16 12:14:32.322704 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 12:14:32.323680 systemd[1]: Started motdgen.path - Watch for update engine configuration changes.
Dec 16 12:14:32.324662 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data.
Dec 16 12:14:32.325842 systemd[1]: Started logrotate.timer - Daily rotation of log files.
Dec 16 12:14:32.326842 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information..
Dec 16 12:14:32.327886 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update.
Dec 16 12:14:32.328993 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update.
Dec 16 12:14:32.329942 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories.
Dec 16 12:14:32.331132 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate).
Dec 16 12:14:32.331167 systemd[1]: Reached target paths.target - Path Units.
Dec 16 12:14:32.331833 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 12:14:32.333405 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket.
Dec 16 12:14:32.335610 systemd[1]: Starting docker.socket - Docker Socket for the API...
Dec 16 12:14:32.338942 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local).
Dec 16 12:14:32.340165 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK).
Dec 16 12:14:32.341064 systemd[1]: Reached target ssh-access.target - SSH Access Available.
Dec 16 12:14:32.343956 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket.
Dec 16 12:14:32.345119 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket.
Dec 16 12:14:32.346592 systemd[1]: Listening on docker.socket - Docker Socket for the API.
Dec 16 12:14:32.347521 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 12:14:32.348254 systemd[1]: Reached target basic.target - Basic System.
Dec 16 12:14:32.348964 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met.
Dec 16 12:14:32.348997 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met.
Dec 16 12:14:32.351384 systemd[1]: Starting chronyd.service - NTP client/server...
Dec 16 12:14:32.352980 systemd[1]: Starting containerd.service - containerd container runtime...
Dec 16 12:14:32.355211 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent...
Dec 16 12:14:32.358251 systemd[1]: Starting dbus.service - D-Bus System Message Bus...
Dec 16 12:14:32.359868 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown...
Dec 16 12:14:32.363093 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:14:32.364318 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit...
Dec 16 12:14:32.367330 systemd[1]: Starting extend-filesystems.service - Extend Filesystems...
Dec 16 12:14:32.368151 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment).
Dec 16 12:14:32.379327 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd...
Dec 16 12:14:32.381003 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin...
Dec 16 12:14:32.383264 jq[1636]: false
Dec 16 12:14:32.384098 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline...
Dec 16 12:14:32.386041 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys...
Dec 16 12:14:32.389244 systemd[1]: Starting systemd-logind.service - User Login Management...
Dec 16 12:14:32.390092 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0).
Dec 16 12:14:32.390493 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details.
Dec 16 12:14:32.390821 chronyd[1630]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG)
Dec 16 12:14:32.392117 chronyd[1630]: Loaded seccomp filter (level 2)
Dec 16 12:14:32.393240 systemd[1]: Starting update-engine.service - Update Engine...
Dec 16 12:14:32.396249 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition...
Dec 16 12:14:32.397975 systemd[1]: Started chronyd.service - NTP client/server.
Dec 16 12:14:32.402110 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown.
Dec 16 12:14:32.402340 extend-filesystems[1638]: Found /dev/vda6
Dec 16 12:14:32.404447 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'.
Dec 16 12:14:32.404671 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped.
Dec 16 12:14:32.405475 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully.
Dec 16 12:14:32.407122 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline.
Dec 16 12:14:32.410514 jq[1649]: true
Dec 16 12:14:32.410757 extend-filesystems[1638]: Found /dev/vda9
Dec 16 12:14:32.414064 extend-filesystems[1638]: Checking size of /dev/vda9
Dec 16 12:14:32.417796 systemd[1]: motdgen.service: Deactivated successfully.
Dec 16 12:14:32.418026 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd.
Dec 16 12:14:32.427862 jq[1670]: true
Dec 16 12:14:32.428105 extend-filesystems[1638]: Resized partition /dev/vda9
Dec 16 12:14:32.435975 extend-filesystems[1685]: resize2fs 1.47.3 (8-Jul-2025)
Dec 16 12:14:32.444123 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks
Dec 16 12:14:32.446477 tar[1657]: linux-arm64/LICENSE
Dec 16 12:14:32.446477 tar[1657]: linux-arm64/helm
Dec 16 12:14:32.452030 update_engine[1647]: I20251216 12:14:32.451700 1647 main.cc:92] Flatcar Update Engine starting
Dec 16 12:14:32.476856 dbus-daemon[1633]: [system] SELinux support is enabled
Dec 16 12:14:32.477305 systemd[1]: Started dbus.service - D-Bus System Message Bus.
Dec 16 12:14:32.480929 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml).
Dec 16 12:14:32.480967 systemd[1]: Reached target system-config.target - Load system-provided cloud configs.
Dec 16 12:14:32.482317 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url).
Dec 16 12:14:32.482339 systemd[1]: Reached target user-config.target - Load user-provided cloud configs.
Dec 16 12:14:32.484600 update_engine[1647]: I20251216 12:14:32.484454 1647 update_check_scheduler.cc:74] Next update check in 11m9s
Dec 16 12:14:32.486026 systemd[1]: Started update-engine.service - Update Engine.
Dec 16 12:14:32.488570 systemd[1]: Started locksmithd.service - Cluster reboot manager.
Dec 16 12:14:32.528636 systemd-logind[1646]: New seat seat0.
Dec 16 12:14:32.571392 locksmithd[1698]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot"
Dec 16 12:14:32.599601 systemd-logind[1646]: Watching system buttons on /dev/input/event0 (Power Button)
Dec 16 12:14:32.599617 systemd-logind[1646]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard)
Dec 16 12:14:32.599924 systemd[1]: Started systemd-logind.service - User Login Management.
Dec 16 12:14:32.635495 containerd[1680]: time="2025-12-16T12:14:32Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8
Dec 16 12:14:32.767206 containerd[1680]: time="2025-12-16T12:14:32.767141520Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5
Dec 16 12:14:32.777365 containerd[1680]: time="2025-12-16T12:14:32.777276520Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="8.8µs"
Dec 16 12:14:32.777365 containerd[1680]: time="2025-12-16T12:14:32.777317640Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1
Dec 16 12:14:32.777365 containerd[1680]: time="2025-12-16T12:14:32.777358880Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1
Dec 16 12:14:32.777365 containerd[1680]: time="2025-12-16T12:14:32.777369960Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1
Dec 16 12:14:32.777685 containerd[1680]: time="2025-12-16T12:14:32.777522440Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1
Dec 16 12:14:32.777685 containerd[1680]: time="2025-12-16T12:14:32.777540480Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 16 12:14:32.777685 containerd[1680]: time="2025-12-16T12:14:32.777590480Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1
Dec 16 12:14:32.777685 containerd[1680]: time="2025-12-16T12:14:32.777601720Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.777946 containerd[1680]: time="2025-12-16T12:14:32.777866200Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.777946 containerd[1680]: time="2025-12-16T12:14:32.777887360Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 16 12:14:32.777946 containerd[1680]: time="2025-12-16T12:14:32.777898280Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1
Dec 16 12:14:32.777946 containerd[1680]: time="2025-12-16T12:14:32.777906280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.778327 containerd[1680]: time="2025-12-16T12:14:32.778276840Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.778327 containerd[1680]: time="2025-12-16T12:14:32.778302000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1
Dec 16 12:14:32.779110 containerd[1680]: time="2025-12-16T12:14:32.778491640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.779110 containerd[1680]: time="2025-12-16T12:14:32.778678080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.779110 containerd[1680]: time="2025-12-16T12:14:32.778706960Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1
Dec 16 12:14:32.779110 containerd[1680]: time="2025-12-16T12:14:32.778715680Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1
Dec 16 12:14:32.780361 containerd[1680]: time="2025-12-16T12:14:32.780266200Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1
Dec 16 12:14:32.780589 containerd[1680]: time="2025-12-16T12:14:32.780549640Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1
Dec 16 12:14:32.806005 containerd[1680]: time="2025-12-16T12:14:32.805915720Z" level=info msg="metadata content store policy set" policy=shared
Dec 16 12:14:32.810096 bash[1702]: Updated "/home/core/.ssh/authorized_keys"
Dec 16 12:14:32.813135 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition.
Dec 16 12:14:32.816163 systemd[1]: Starting sshkeys.service...
Dec 16 12:14:32.838947 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys.
Dec 16 12:14:32.841870 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)...
Dec 16 12:14:32.870468 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910745360Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910827280Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910919960Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910933720Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910947320Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910960040Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910972480Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910982800Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.910995440Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.911007680Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 
12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.911019440Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.911029920Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.911039880Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:14:32.911097 containerd[1680]: time="2025-12-16T12:14:32.911064440Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911230840Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911250600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911264560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911274440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911285480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911300440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911311480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: 
time="2025-12-16T12:14:32.911321600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911331440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911345320Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911357920Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:14:32.911430 containerd[1680]: time="2025-12-16T12:14:32.911382760Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:14:32.911625 containerd[1680]: time="2025-12-16T12:14:32.911433280Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:14:32.911625 containerd[1680]: time="2025-12-16T12:14:32.911452240Z" level=info msg="Start snapshots syncer" Dec 16 12:14:32.911625 containerd[1680]: time="2025-12-16T12:14:32.911475160Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:14:32.912109 containerd[1680]: time="2025-12-16T12:14:32.911691560Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 12:14:32.912109 containerd[1680]: time="2025-12-16T12:14:32.911745600Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:14:32.912251 containerd[1680]: 
time="2025-12-16T12:14:32.911786760Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911875240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911894240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911904040Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911913680Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911925360Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911935280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911955320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911966120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.911978240Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.912006920Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:14:32.912251 containerd[1680]: 
time="2025-12-16T12:14:32.912019160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:14:32.912251 containerd[1680]: time="2025-12-16T12:14:32.912028320Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912038160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912046080Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912057120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912067200Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912189920Z" level=info msg="runtime interface created" Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912196040Z" level=info msg="created NRI interface" Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912211280Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912222600Z" level=info msg="Connect containerd service" Dec 16 12:14:32.912463 containerd[1680]: time="2025-12-16T12:14:32.912244920Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:14:32.913087 containerd[1680]: time="2025-12-16T12:14:32.912856160Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:14:32.993740 containerd[1680]: time="2025-12-16T12:14:32.993665960Z" level=info msg="Start subscribing containerd event" Dec 16 12:14:32.993740 containerd[1680]: time="2025-12-16T12:14:32.993733720Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:14:32.993740 containerd[1680]: time="2025-12-16T12:14:32.993745400Z" level=info msg="Start recovering state" Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993779400Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993836760Z" level=info msg="Start event monitor" Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993850240Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993856560Z" level=info msg="Start streaming server" Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993864840Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993872160Z" level=info msg="runtime interface starting up..." Dec 16 12:14:32.993876 containerd[1680]: time="2025-12-16T12:14:32.993877960Z" level=info msg="starting plugins..." Dec 16 12:14:32.994007 containerd[1680]: time="2025-12-16T12:14:32.993892440Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:14:32.994007 containerd[1680]: time="2025-12-16T12:14:32.994002600Z" level=info msg="containerd successfully booted in 0.358861s" Dec 16 12:14:32.994188 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 12:14:33.129898 tar[1657]: linux-arm64/README.md Dec 16 12:14:33.150127 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. 
Dec 16 12:14:33.290269 systemd-networkd[1582]: eth0: Gained IPv6LL Dec 16 12:14:33.292972 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:14:33.294674 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:14:33.296828 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:14:33.298844 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:14:33.306158 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Dec 16 12:14:33.449246 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:33.459542 extend-filesystems[1685]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 12:14:33.459542 extend-filesystems[1685]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 12:14:33.459542 extend-filesystems[1685]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Dec 16 12:14:33.458733 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:14:33.464267 extend-filesystems[1638]: Resized filesystem in /dev/vda9 Dec 16 12:14:33.459035 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 12:14:33.471155 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:14:33.754203 sshd_keygen[1669]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 12:14:33.773569 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Dec 16 12:14:33.777273 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 12:14:33.793262 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 12:14:33.795152 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 12:14:33.798202 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 12:14:33.817846 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. 
Dec 16 12:14:33.820761 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:14:33.822985 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 12:14:33.824454 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:14:33.878118 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:34.395336 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:14:34.399361 (kubelet)[1776]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:14:34.925725 kubelet[1776]: E1216 12:14:34.925613 1776 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:14:34.927860 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:14:34.927988 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:14:34.928513 systemd[1]: kubelet.service: Consumed 772ms CPU time, 257.6M memory peak. 
Dec 16 12:14:35.387145 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:35.891101 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:39.394116 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:39.400512 coreos-metadata[1632]: Dec 16 12:14:39.400 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:14:39.417531 coreos-metadata[1632]: Dec 16 12:14:39.417 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 12:14:39.869525 coreos-metadata[1632]: Dec 16 12:14:39.869 INFO Fetch successful Dec 16 12:14:39.869692 coreos-metadata[1632]: Dec 16 12:14:39.869 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:14:39.903100 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:14:39.912473 coreos-metadata[1717]: Dec 16 12:14:39.912 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:14:39.925912 coreos-metadata[1717]: Dec 16 12:14:39.925 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 12:14:40.108704 coreos-metadata[1632]: Dec 16 12:14:40.108 INFO Fetch successful Dec 16 12:14:40.108704 coreos-metadata[1632]: Dec 16 12:14:40.108 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 12:14:40.111528 coreos-metadata[1717]: Dec 16 12:14:40.111 INFO Fetch successful Dec 16 12:14:40.111528 coreos-metadata[1717]: Dec 16 12:14:40.111 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 12:14:40.344022 coreos-metadata[1717]: Dec 16 12:14:40.343 INFO Fetch successful Dec 16 12:14:40.346147 unknown[1717]: wrote ssh authorized keys file for user: core Dec 16 12:14:40.346571 coreos-metadata[1632]: Dec 16 12:14:40.346 INFO Fetch successful Dec 16 12:14:40.346571 coreos-metadata[1632]: Dec 16 12:14:40.346 INFO Fetching 
http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 12:14:40.382524 update-ssh-keys[1795]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:14:40.384214 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 12:14:40.387429 systemd[1]: Finished sshkeys.service. Dec 16 12:14:40.472667 coreos-metadata[1632]: Dec 16 12:14:40.472 INFO Fetch successful Dec 16 12:14:40.472667 coreos-metadata[1632]: Dec 16 12:14:40.472 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 12:14:40.594678 coreos-metadata[1632]: Dec 16 12:14:40.594 INFO Fetch successful Dec 16 12:14:40.594678 coreos-metadata[1632]: Dec 16 12:14:40.594 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 12:14:40.716827 coreos-metadata[1632]: Dec 16 12:14:40.716 INFO Fetch successful Dec 16 12:14:40.740819 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:14:40.741307 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 12:14:40.741444 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:14:40.742221 systemd[1]: Startup finished in 2.504s (kernel) + 13.834s (initrd) + 10.855s (userspace) = 27.195s. Dec 16 12:14:42.838180 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 12:14:42.839399 systemd[1]: Started sshd@0-10.0.21.180:22-139.178.68.195:59782.service - OpenSSH per-connection server daemon (139.178.68.195:59782). Dec 16 12:14:43.777548 sshd[1804]: Accepted publickey for core from 139.178.68.195 port 59782 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:43.780239 sshd-session[1804]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:43.786606 systemd[1]: Created slice user-500.slice - User Slice of UID 500. 
Dec 16 12:14:43.787653 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:14:43.792309 systemd-logind[1646]: New session 1 of user core. Dec 16 12:14:43.828137 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:14:43.830376 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:14:43.849329 (systemd)[1810]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:43.851612 systemd-logind[1646]: New session 2 of user core. Dec 16 12:14:43.968156 systemd[1810]: Queued start job for default target default.target. Dec 16 12:14:43.988356 systemd[1810]: Created slice app.slice - User Application Slice. Dec 16 12:14:43.988388 systemd[1810]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:14:43.988400 systemd[1810]: Reached target paths.target - Paths. Dec 16 12:14:43.988446 systemd[1810]: Reached target timers.target - Timers. Dec 16 12:14:43.989629 systemd[1810]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:14:43.990359 systemd[1810]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:14:43.999745 systemd[1810]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:14:44.000124 systemd[1810]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:14:44.000891 systemd[1810]: Reached target sockets.target - Sockets. Dec 16 12:14:44.000934 systemd[1810]: Reached target basic.target - Basic System. Dec 16 12:14:44.000962 systemd[1810]: Reached target default.target - Main User Target. Dec 16 12:14:44.000986 systemd[1810]: Startup finished in 144ms. Dec 16 12:14:44.001374 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:14:44.009397 systemd[1]: Started session-1.scope - Session 1 of User core. 
Dec 16 12:14:44.512710 systemd[1]: Started sshd@1-10.0.21.180:22-139.178.68.195:59790.service - OpenSSH per-connection server daemon (139.178.68.195:59790). Dec 16 12:14:45.153742 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:14:45.155168 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:14:45.294750 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:14:45.298973 (kubelet)[1835]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:14:45.358781 sshd[1824]: Accepted publickey for core from 139.178.68.195 port 59790 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:45.360132 sshd-session[1824]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:45.364487 systemd-logind[1646]: New session 3 of user core. Dec 16 12:14:45.376458 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:14:45.723354 kubelet[1835]: E1216 12:14:45.637932 1835 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:14:45.640816 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:14:45.640930 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:14:45.642198 systemd[1]: kubelet.service: Consumed 146ms CPU time, 107.9M memory peak. 
Dec 16 12:14:45.840303 sshd[1842]: Connection closed by 139.178.68.195 port 59790 Dec 16 12:14:45.840220 sshd-session[1824]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:45.843849 systemd[1]: sshd@1-10.0.21.180:22-139.178.68.195:59790.service: Deactivated successfully. Dec 16 12:14:45.845356 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:14:45.846781 systemd-logind[1646]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:14:45.847758 systemd-logind[1646]: Removed session 3. Dec 16 12:14:46.020640 systemd[1]: Started sshd@2-10.0.21.180:22-139.178.68.195:59806.service - OpenSSH per-connection server daemon (139.178.68.195:59806). Dec 16 12:14:46.870988 sshd[1850]: Accepted publickey for core from 139.178.68.195 port 59806 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:46.872269 sshd-session[1850]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:46.878314 systemd-logind[1646]: New session 4 of user core. Dec 16 12:14:46.885452 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:14:47.350982 sshd[1854]: Connection closed by 139.178.68.195 port 59806 Dec 16 12:14:47.351295 sshd-session[1850]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:47.355165 systemd[1]: sshd@2-10.0.21.180:22-139.178.68.195:59806.service: Deactivated successfully. Dec 16 12:14:47.356710 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:14:47.358793 systemd-logind[1646]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:14:47.359736 systemd-logind[1646]: Removed session 4. Dec 16 12:14:47.546400 systemd[1]: Started sshd@3-10.0.21.180:22-139.178.68.195:59816.service - OpenSSH per-connection server daemon (139.178.68.195:59816). 
Dec 16 12:14:48.453434 sshd[1860]: Accepted publickey for core from 139.178.68.195 port 59816 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:48.455453 sshd-session[1860]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:48.462048 systemd-logind[1646]: New session 5 of user core. Dec 16 12:14:48.470278 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:14:48.970021 sshd[1864]: Connection closed by 139.178.68.195 port 59816 Dec 16 12:14:48.970395 sshd-session[1860]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:48.974032 systemd[1]: sshd@3-10.0.21.180:22-139.178.68.195:59816.service: Deactivated successfully. Dec 16 12:14:48.976633 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:14:48.978652 systemd-logind[1646]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:14:48.979524 systemd-logind[1646]: Removed session 5. Dec 16 12:14:49.147207 systemd[1]: Started sshd@4-10.0.21.180:22-139.178.68.195:59824.service - OpenSSH per-connection server daemon (139.178.68.195:59824). Dec 16 12:14:50.040183 sshd[1870]: Accepted publickey for core from 139.178.68.195 port 59824 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:50.041567 sshd-session[1870]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:50.048845 systemd-logind[1646]: New session 6 of user core. Dec 16 12:14:50.056275 systemd[1]: Started session-6.scope - Session 6 of User core. 
Dec 16 12:14:50.394544 sudo[1875]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:14:50.394814 sudo[1875]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:14:50.407374 sudo[1875]: pam_unix(sudo:session): session closed for user root Dec 16 12:14:50.575167 sshd[1874]: Connection closed by 139.178.68.195 port 59824 Dec 16 12:14:50.575850 sshd-session[1870]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:50.580261 systemd[1]: sshd@4-10.0.21.180:22-139.178.68.195:59824.service: Deactivated successfully. Dec 16 12:14:50.581960 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:14:50.584761 systemd-logind[1646]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:14:50.585832 systemd-logind[1646]: Removed session 6. Dec 16 12:14:50.750502 systemd[1]: Started sshd@5-10.0.21.180:22-139.178.68.195:57588.service - OpenSSH per-connection server daemon (139.178.68.195:57588). Dec 16 12:14:51.623112 sshd[1882]: Accepted publickey for core from 139.178.68.195 port 57588 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:51.624503 sshd-session[1882]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:51.628872 systemd-logind[1646]: New session 7 of user core. Dec 16 12:14:51.643503 systemd[1]: Started session-7.scope - Session 7 of User core. 
Dec 16 12:14:51.953855 sudo[1888]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:14:51.954147 sudo[1888]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:14:51.956694 sudo[1888]: pam_unix(sudo:session): session closed for user root Dec 16 12:14:51.962769 sudo[1887]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:14:51.963036 sudo[1887]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:14:51.969883 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:14:52.013000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:14:52.013564 augenrules[1912]: No rules Dec 16 12:14:52.015192 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:14:52.015298 kernel: kauditd_printk_skb: 188 callbacks suppressed Dec 16 12:14:52.015337 kernel: audit: type=1305 audit(1765887292.013:232): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:14:52.015354 kernel: audit: type=1300 audit(1765887292.013:232): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdd83b340 a2=420 a3=0 items=0 ppid=1893 pid=1912 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:52.013000 audit[1912]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffdd83b340 a2=420 a3=0 items=0 ppid=1893 pid=1912 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:52.015446 systemd[1]: Finished audit-rules.service - Load 
Audit Rules. Dec 16 12:14:52.018330 kernel: audit: type=1327 audit(1765887292.013:232): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:14:52.013000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:14:52.017644 sudo[1887]: pam_unix(sudo:session): session closed for user root Dec 16 12:14:52.015000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.021764 kernel: audit: type=1130 audit(1765887292.015:233): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.021803 kernel: audit: type=1131 audit(1765887292.015:234): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.015000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.017000 audit[1887]: USER_END pid=1887 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.026153 kernel: audit: type=1106 audit(1765887292.017:235): pid=1887 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:52.026188 kernel: audit: type=1104 audit(1765887292.017:236): pid=1887 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.017000 audit[1887]: CRED_DISP pid=1887 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.180755 sshd[1886]: Connection closed by 139.178.68.195 port 57588 Dec 16 12:14:52.181109 sshd-session[1882]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:52.182000 audit[1882]: USER_END pid=1882 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:52.185253 systemd[1]: sshd@5-10.0.21.180:22-139.178.68.195:57588.service: Deactivated successfully. Dec 16 12:14:52.182000 audit[1882]: CRED_DISP pid=1882 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:52.186961 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:14:52.187809 systemd-logind[1646]: Session 7 logged out. Waiting for processes to exit. 
Dec 16 12:14:52.188618 kernel: audit: type=1106 audit(1765887292.182:237): pid=1882 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:52.188675 kernel: audit: type=1104 audit(1765887292.182:238): pid=1882 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:52.188702 kernel: audit: type=1131 audit(1765887292.185:239): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.21.180:22-139.178.68.195:57588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.185000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-10.0.21.180:22-139.178.68.195:57588 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.188962 systemd-logind[1646]: Removed session 7. Dec 16 12:14:52.354000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.21.180:22-139.178.68.195:57592 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:52.354335 systemd[1]: Started sshd@6-10.0.21.180:22-139.178.68.195:57592.service - OpenSSH per-connection server daemon (139.178.68.195:57592). 
Dec 16 12:14:53.204000 audit[1921]: USER_ACCT pid=1921 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:53.204466 sshd[1921]: Accepted publickey for core from 139.178.68.195 port 57592 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:53.205000 audit[1921]: CRED_ACQ pid=1921 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:53.205000 audit[1921]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe55050a0 a2=3 a3=0 items=0 ppid=1 pid=1921 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:53.205000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:53.205840 sshd-session[1921]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:53.210425 systemd-logind[1646]: New session 8 of user core. Dec 16 12:14:53.219295 systemd[1]: Started session-8.scope - Session 8 of User core. 
Dec 16 12:14:53.222000 audit[1921]: USER_START pid=1921 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:53.224000 audit[1925]: CRED_ACQ pid=1925 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:53.535000 audit[1926]: USER_ACCT pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:14:53.535000 audit[1926]: CRED_REFR pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:14:53.535000 audit[1926]: USER_START pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:14:53.535433 sudo[1926]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:14:53.535687 sudo[1926]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:14:53.837165 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Dec 16 12:14:53.853774 (dockerd)[1948]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:14:54.081549 dockerd[1948]: time="2025-12-16T12:14:54.081487000Z" level=info msg="Starting up" Dec 16 12:14:54.083635 dockerd[1948]: time="2025-12-16T12:14:54.083612160Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:14:54.094484 dockerd[1948]: time="2025-12-16T12:14:54.094246720Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:14:54.109180 systemd[1]: var-lib-docker-check\x2doverlayfs\x2dsupport1270202391-merged.mount: Deactivated successfully. Dec 16 12:14:54.132923 dockerd[1948]: time="2025-12-16T12:14:54.132700160Z" level=info msg="Loading containers: start." Dec 16 12:14:54.144109 kernel: Initializing XFRM netlink socket Dec 16 12:14:54.195000 audit[1999]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1999 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.195000 audit[1999]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff66d58b0 a2=0 a3=0 items=0 ppid=1948 pid=1999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.195000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:14:54.196000 audit[2001]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2001 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.196000 audit[2001]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffcc86db50 a2=0 a3=0 items=0 ppid=1948 pid=2001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.196000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:14:54.198000 audit[2003]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2003 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.198000 audit[2003]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff34ca8f0 a2=0 a3=0 items=0 ppid=1948 pid=2003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.198000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:14:54.200000 audit[2005]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2005 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.200000 audit[2005]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdc4bda00 a2=0 a3=0 items=0 ppid=1948 pid=2005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.200000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:14:54.202000 audit[2007]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2007 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.202000 audit[2007]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffac4fbc0 a2=0 a3=0 items=0 ppid=1948 pid=2007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.202000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:14:54.204000 audit[2009]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2009 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.204000 audit[2009]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffeee91700 a2=0 a3=0 items=0 ppid=1948 pid=2009 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.204000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:14:54.206000 audit[2011]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2011 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.206000 audit[2011]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffea4e2e00 a2=0 a3=0 items=0 ppid=1948 pid=2011 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.206000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:14:54.208000 audit[2013]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2013 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.208000 audit[2013]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=fffff0e9b280 a2=0 a3=0 items=0 ppid=1948 pid=2013 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.208000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:14:54.235000 audit[2016]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2016 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.235000 audit[2016]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=fffff33b9930 a2=0 a3=0 items=0 ppid=1948 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.235000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:14:54.237000 audit[2018]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2018 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.237000 audit[2018]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffcbb36b40 a2=0 a3=0 items=0 ppid=1948 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.237000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:14:54.239000 audit[2020]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2020 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.239000 audit[2020]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=236 a0=3 a1=fffff6ef5490 a2=0 a3=0 items=0 ppid=1948 pid=2020 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.239000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:14:54.241000 audit[2022]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2022 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.241000 audit[2022]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffdb7e2960 a2=0 a3=0 items=0 ppid=1948 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.241000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:14:54.243000 audit[2024]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2024 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.243000 audit[2024]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffcfc73f50 a2=0 a3=0 items=0 ppid=1948 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.243000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:14:54.280000 audit[2054]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2054 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.280000 
audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffdeb74fd0 a2=0 a3=0 items=0 ppid=1948 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.280000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:14:54.282000 audit[2056]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2056 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.282000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffff3b72400 a2=0 a3=0 items=0 ppid=1948 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.282000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:14:54.284000 audit[2058]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2058 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.284000 audit[2058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffed9d98a0 a2=0 a3=0 items=0 ppid=1948 pid=2058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.284000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:14:54.286000 audit[2060]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2060 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.286000 audit[2060]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe838bdf0 a2=0 a3=0 items=0 ppid=1948 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.286000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:14:54.288000 audit[2062]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2062 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.288000 audit[2062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffec8e34e0 a2=0 a3=0 items=0 ppid=1948 pid=2062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.288000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:14:54.290000 audit[2064]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2064 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.290000 audit[2064]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdbc84190 a2=0 a3=0 items=0 ppid=1948 pid=2064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.290000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:14:54.292000 audit[2066]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.292000 
audit[2066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff8174cd0 a2=0 a3=0 items=0 ppid=1948 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.292000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:14:54.294000 audit[2068]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2068 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.294000 audit[2068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffef645d60 a2=0 a3=0 items=0 ppid=1948 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.294000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:14:54.296000 audit[2070]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.296000 audit[2070]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffdd47e9b0 a2=0 a3=0 items=0 ppid=1948 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.296000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 
12:14:54.298000 audit[2072]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.298000 audit[2072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffdaed4110 a2=0 a3=0 items=0 ppid=1948 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.298000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:14:54.300000 audit[2074]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2074 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.300000 audit[2074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffc7770790 a2=0 a3=0 items=0 ppid=1948 pid=2074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.300000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:14:54.302000 audit[2076]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2076 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.302000 audit[2076]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffee55a080 a2=0 a3=0 items=0 ppid=1948 pid=2076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.302000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:14:54.304000 audit[2078]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2078 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.304000 audit[2078]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=fffffabe3bb0 a2=0 a3=0 items=0 ppid=1948 pid=2078 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.304000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:14:54.309000 audit[2083]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.309000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff24d00b0 a2=0 a3=0 items=0 ppid=1948 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.309000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:14:54.311000 audit[2085]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.311000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffe2476d50 a2=0 a3=0 items=0 ppid=1948 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:14:54.311000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:14:54.313000 audit[2087]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.313000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffd32711b0 a2=0 a3=0 items=0 ppid=1948 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.313000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:14:54.315000 audit[2089]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.315000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff6d20df0 a2=0 a3=0 items=0 ppid=1948 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.315000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:14:54.317000 audit[2091]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.317000 audit[2091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffcbbdbda0 a2=0 a3=0 items=0 ppid=1948 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.317000 
audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:14:54.319000 audit[2093]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2093 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:14:54.319000 audit[2093]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=fffff5457f60 a2=0 a3=0 items=0 ppid=1948 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.319000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:14:54.349000 audit[2098]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2098 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.349000 audit[2098]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffe943b7c0 a2=0 a3=0 items=0 ppid=1948 pid=2098 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.349000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:14:54.352000 audit[2100]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2100 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.352000 audit[2100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffea79a5c0 a2=0 a3=0 items=0 ppid=1948 pid=2100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.352000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:14:54.360000 audit[2108]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2108 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.360000 audit[2108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffc8b4b750 a2=0 a3=0 items=0 ppid=1948 pid=2108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.360000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:14:54.374000 audit[2114]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.374000 audit[2114]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffc5e20750 a2=0 a3=0 items=0 ppid=1948 pid=2114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.374000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:14:54.376000 audit[2116]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.376000 audit[2116]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffe9a5b950 a2=0 a3=0 items=0 ppid=1948 pid=2116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.376000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:14:54.378000 audit[2118]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2118 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.378000 audit[2118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffe82e4d20 a2=0 a3=0 items=0 ppid=1948 pid=2118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.378000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:14:54.380000 audit[2120]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.380000 audit[2120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffff3103120 a2=0 a3=0 items=0 ppid=1948 pid=2120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.380000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:14:54.382000 audit[2122]: NETFILTER_CFG table=filter:41 family=2 entries=1 
op=nft_register_rule pid=2122 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:14:54.382000 audit[2122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffeacb9ca0 a2=0 a3=0 items=0 ppid=1948 pid=2122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:54.382000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:14:54.383837 systemd-networkd[1582]: docker0: Link UP Dec 16 12:14:54.388919 dockerd[1948]: time="2025-12-16T12:14:54.388867080Z" level=info msg="Loading containers: done." Dec 16 12:14:54.410492 dockerd[1948]: time="2025-12-16T12:14:54.410431360Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:14:54.410701 dockerd[1948]: time="2025-12-16T12:14:54.410543040Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:14:54.410767 dockerd[1948]: time="2025-12-16T12:14:54.410743480Z" level=info msg="Initializing buildkit" Dec 16 12:14:54.434290 dockerd[1948]: time="2025-12-16T12:14:54.434238800Z" level=info msg="Completed buildkit initialization" Dec 16 12:14:54.440836 dockerd[1948]: time="2025-12-16T12:14:54.440780680Z" level=info msg="Daemon has completed initialization" Dec 16 12:14:54.441159 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 12:14:54.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:54.441680 dockerd[1948]: time="2025-12-16T12:14:54.440857360Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:14:55.107440 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3151103016-merged.mount: Deactivated successfully. Dec 16 12:14:55.571861 containerd[1680]: time="2025-12-16T12:14:55.571800400Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\"" Dec 16 12:14:55.653631 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:14:55.654967 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:14:55.795056 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:14:55.795000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:55.798807 (kubelet)[2172]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:14:55.836085 kubelet[2172]: E1216 12:14:55.835919 2172 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:14:55.838160 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:14:55.838287 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:14:55.840000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Dec 16 12:14:55.840249 systemd[1]: kubelet.service: Consumed 137ms CPU time, 105.8M memory peak. Dec 16 12:14:56.174575 chronyd[1630]: Selected source PHC0 Dec 16 12:14:56.943395 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1780306756.mount: Deactivated successfully. Dec 16 12:14:57.821560 containerd[1680]: time="2025-12-16T12:14:57.821395931Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:14:57.822910 containerd[1680]: time="2025-12-16T12:14:57.822846582Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.32.10: active requests=0, bytes read=24835766" Dec 16 12:14:57.823848 containerd[1680]: time="2025-12-16T12:14:57.823825844Z" level=info msg="ImageCreate event name:\"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:14:57.827169 containerd[1680]: time="2025-12-16T12:14:57.827126301Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:14:57.828126 containerd[1680]: time="2025-12-16T12:14:57.828091115Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.32.10\" with image id \"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\", repo tag \"registry.k8s.io/kube-apiserver:v1.32.10\", repo digest \"registry.k8s.io/kube-apiserver@sha256:af4ee57c047e31a7f58422b94a9ec4c62221d3deebb16755bdeff720df796189\", size \"26428558\" in 2.256249275s" Dec 16 12:14:57.828223 containerd[1680]: time="2025-12-16T12:14:57.828207863Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.32.10\" returns image reference \"sha256:03aec5fd5841efdd990b8fe285e036fc1386e2f8851378ce2c9dfd1b331897ea\"" Dec 16 12:14:57.828806 containerd[1680]: 
time="2025-12-16T12:14:57.828778436Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\"" Dec 16 12:14:58.998334 containerd[1680]: time="2025-12-16T12:14:58.998284132Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:14:59.000927 containerd[1680]: time="2025-12-16T12:14:59.000863356Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.32.10: active requests=0, bytes read=22610801" Dec 16 12:14:59.002087 containerd[1680]: time="2025-12-16T12:14:59.002042047Z" level=info msg="ImageCreate event name:\"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:14:59.005005 containerd[1680]: time="2025-12-16T12:14:59.004973155Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:14:59.006828 containerd[1680]: time="2025-12-16T12:14:59.006784012Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.32.10\" with image id \"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\", repo tag \"registry.k8s.io/kube-controller-manager:v1.32.10\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:efbd9d1dfcd2940e1c73a1476c880c3c2cdf04cc60722d329b21cd48745c8660\", size \"24203439\" in 1.177974985s" Dec 16 12:14:59.006828 containerd[1680]: time="2025-12-16T12:14:59.006826572Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.32.10\" returns image reference \"sha256:66490a6490dde2df4a78eba21320da67070ad88461899536880edb5301ec2ba3\"" Dec 16 12:14:59.007338 containerd[1680]: time="2025-12-16T12:14:59.007310897Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\"" Dec 16 
12:15:00.205345 containerd[1680]: time="2025-12-16T12:15:00.205274074Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:00.206494 containerd[1680]: time="2025-12-16T12:15:00.206442920Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.32.10: active requests=0, bytes read=17610300" Dec 16 12:15:00.207793 containerd[1680]: time="2025-12-16T12:15:00.207736846Z" level=info msg="ImageCreate event name:\"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:00.210317 containerd[1680]: time="2025-12-16T12:15:00.210275299Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:00.211402 containerd[1680]: time="2025-12-16T12:15:00.211351305Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.32.10\" with image id \"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\", repo tag \"registry.k8s.io/kube-scheduler:v1.32.10\", repo digest \"registry.k8s.io/kube-scheduler@sha256:9c58e1adcad5af66d1d9ca5cf9a4c266e4054b8f19f91a8fff1993549e657b10\", size \"19202938\" in 1.203986288s" Dec 16 12:15:00.211402 containerd[1680]: time="2025-12-16T12:15:00.211385305Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.32.10\" returns image reference \"sha256:fcf368a1abd0b48cff2fd3cca12fcc008aaf52eeab885656f11e7773c6a188a3\"" Dec 16 12:15:00.211823 containerd[1680]: time="2025-12-16T12:15:00.211788547Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\"" Dec 16 12:15:01.206915 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount23714640.mount: Deactivated successfully. 
Dec 16 12:15:01.555200 containerd[1680]: time="2025-12-16T12:15:01.555052797Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.32.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:01.556385 containerd[1680]: time="2025-12-16T12:15:01.556327924Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.32.10: active requests=0, bytes read=17716804" Dec 16 12:15:01.557471 containerd[1680]: time="2025-12-16T12:15:01.557427569Z" level=info msg="ImageCreate event name:\"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:01.559788 containerd[1680]: time="2025-12-16T12:15:01.559715861Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:01.560451 containerd[1680]: time="2025-12-16T12:15:01.560229744Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.32.10\" with image id \"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\", repo tag \"registry.k8s.io/kube-proxy:v1.32.10\", repo digest \"registry.k8s.io/kube-proxy@sha256:e3dda1c7b384f9eb5b2fa1c27493b23b80e6204b9fa2ee8791b2de078f468cbf\", size \"27560818\" in 1.348402557s" Dec 16 12:15:01.560451 containerd[1680]: time="2025-12-16T12:15:01.560261984Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.32.10\" returns image reference \"sha256:8b57c1f8bd2ddfa793889457b41e87132f192046e262b32ab0514f32d28be47d\"" Dec 16 12:15:01.560753 containerd[1680]: time="2025-12-16T12:15:01.560682066Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\"" Dec 16 12:15:02.142728 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount211295351.mount: Deactivated successfully. 
Dec 16 12:15:02.778177 containerd[1680]: time="2025-12-16T12:15:02.778110195Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.11.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:02.779376 containerd[1680]: time="2025-12-16T12:15:02.779317721Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.11.3: active requests=0, bytes read=15956475" Dec 16 12:15:02.780473 containerd[1680]: time="2025-12-16T12:15:02.780414086Z" level=info msg="ImageCreate event name:\"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:02.782913 containerd[1680]: time="2025-12-16T12:15:02.782850939Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:02.784085 containerd[1680]: time="2025-12-16T12:15:02.784028065Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.11.3\" with image id \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\", repo tag \"registry.k8s.io/coredns/coredns:v1.11.3\", repo digest \"registry.k8s.io/coredns/coredns@sha256:9caabbf6238b189a65d0d6e6ac138de60d6a1c419e5a341fbbb7c78382559c6e\", size \"16948420\" in 1.223275559s" Dec 16 12:15:02.784085 containerd[1680]: time="2025-12-16T12:15:02.784065385Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.11.3\" returns image reference \"sha256:2f6c962e7b8311337352d9fdea917da2184d9919f4da7695bc2a6517cf392fe4\"" Dec 16 12:15:02.784771 containerd[1680]: time="2025-12-16T12:15:02.784564188Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 16 12:15:03.315600 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3114651809.mount: Deactivated successfully. 
Dec 16 12:15:03.321942 containerd[1680]: time="2025-12-16T12:15:03.321882728Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:15:03.323610 containerd[1680]: time="2025-12-16T12:15:03.323559296Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:15:03.324410 containerd[1680]: time="2025-12-16T12:15:03.324371421Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:15:03.326634 containerd[1680]: time="2025-12-16T12:15:03.326582472Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:15:03.327377 containerd[1680]: time="2025-12-16T12:15:03.327351516Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 542.755128ms" Dec 16 12:15:03.327480 containerd[1680]: time="2025-12-16T12:15:03.327457636Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Dec 16 12:15:03.328167 containerd[1680]: time="2025-12-16T12:15:03.328112120Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\"" Dec 16 12:15:04.012797 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount142180796.mount: Deactivated 
successfully. Dec 16 12:15:05.904657 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 12:15:05.906068 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:15:06.050157 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:15:06.049000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:06.053419 kernel: kauditd_printk_skb: 134 callbacks suppressed Dec 16 12:15:06.053483 kernel: audit: type=1130 audit(1765887306.049:292): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:06.054123 (kubelet)[2370]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:15:06.108329 kubelet[2370]: E1216 12:15:06.108262 2370 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:15:06.110543 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:15:06.110679 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:15:06.112176 systemd[1]: kubelet.service: Consumed 143ms CPU time, 107.5M memory peak. Dec 16 12:15:06.111000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Dec 16 12:15:06.115085 kernel: audit: type=1131 audit(1765887306.111:293): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:15:06.685949 containerd[1680]: time="2025-12-16T12:15:06.685845295Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.16-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:06.687967 containerd[1680]: time="2025-12-16T12:15:06.687918545Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.16-0: active requests=0, bytes read=66060366" Dec 16 12:15:06.688845 containerd[1680]: time="2025-12-16T12:15:06.688822750Z" level=info msg="ImageCreate event name:\"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:06.692252 containerd[1680]: time="2025-12-16T12:15:06.692224567Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:06.693408 containerd[1680]: time="2025-12-16T12:15:06.693377333Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.16-0\" with image id \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\", repo tag \"registry.k8s.io/etcd:3.5.16-0\", repo digest \"registry.k8s.io/etcd@sha256:c6a9d11cc5c04b114ccdef39a9265eeef818e3d02f5359be035ae784097fdec5\", size \"67941650\" in 3.365236813s" Dec 16 12:15:06.693586 containerd[1680]: time="2025-12-16T12:15:06.693488454Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.16-0\" returns image reference \"sha256:7fc9d4aa817aa6a3e549f3cd49d1f7b496407be979fc36dd5f356d59ce8c3a82\"" Dec 16 12:15:12.570291 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:15:12.570443 systemd[1]: kubelet.service: Consumed 143ms CPU time, 107.5M memory peak. Dec 16 12:15:12.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:12.572612 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:15:12.569000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:12.575223 kernel: audit: type=1130 audit(1765887312.569:294): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:12.575291 kernel: audit: type=1131 audit(1765887312.569:295): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:12.594409 systemd[1]: Reload requested from client PID 2409 ('systemctl') (unit session-8.scope)... Dec 16 12:15:12.594423 systemd[1]: Reloading... Dec 16 12:15:12.666206 zram_generator::config[2456]: No configuration found. Dec 16 12:15:12.851879 systemd[1]: Reloading finished in 257 ms. 
Dec 16 12:15:12.881000 audit: BPF prog-id=63 op=LOAD Dec 16 12:15:12.881000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:15:12.882000 audit: BPF prog-id=64 op=LOAD Dec 16 12:15:12.884463 kernel: audit: type=1334 audit(1765887312.881:296): prog-id=63 op=LOAD Dec 16 12:15:12.884547 kernel: audit: type=1334 audit(1765887312.881:297): prog-id=60 op=UNLOAD Dec 16 12:15:12.884565 kernel: audit: type=1334 audit(1765887312.882:298): prog-id=64 op=LOAD Dec 16 12:15:12.888000 audit: BPF prog-id=65 op=LOAD Dec 16 12:15:12.888000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:15:12.890685 kernel: audit: type=1334 audit(1765887312.888:299): prog-id=65 op=LOAD Dec 16 12:15:12.890721 kernel: audit: type=1334 audit(1765887312.888:300): prog-id=61 op=UNLOAD Dec 16 12:15:12.890741 kernel: audit: type=1334 audit(1765887312.888:301): prog-id=62 op=UNLOAD Dec 16 12:15:12.888000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:15:12.888000 audit: BPF prog-id=66 op=LOAD Dec 16 12:15:12.889000 audit: BPF prog-id=67 op=LOAD Dec 16 12:15:12.892614 kernel: audit: type=1334 audit(1765887312.888:302): prog-id=66 op=LOAD Dec 16 12:15:12.892647 kernel: audit: type=1334 audit(1765887312.889:303): prog-id=67 op=LOAD Dec 16 12:15:12.889000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:15:12.889000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:15:12.890000 audit: BPF prog-id=68 op=LOAD Dec 16 12:15:12.890000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:15:12.891000 audit: BPF prog-id=69 op=LOAD Dec 16 12:15:12.891000 audit: BPF prog-id=70 op=LOAD Dec 16 12:15:12.891000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:15:12.891000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:15:12.892000 audit: BPF prog-id=71 op=LOAD Dec 16 12:15:12.892000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:15:12.892000 audit: BPF prog-id=72 op=LOAD Dec 16 12:15:12.892000 audit: BPF prog-id=73 op=LOAD Dec 16 12:15:12.892000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:15:12.892000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:15:12.892000 audit: BPF prog-id=74 op=LOAD 
Dec 16 12:15:12.892000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:15:12.893000 audit: BPF prog-id=75 op=LOAD Dec 16 12:15:12.893000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:15:12.893000 audit: BPF prog-id=76 op=LOAD Dec 16 12:15:12.893000 audit: BPF prog-id=77 op=LOAD Dec 16 12:15:12.893000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:15:12.893000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:15:12.893000 audit: BPF prog-id=78 op=LOAD Dec 16 12:15:12.893000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:15:12.893000 audit: BPF prog-id=79 op=LOAD Dec 16 12:15:12.893000 audit: BPF prog-id=80 op=LOAD Dec 16 12:15:12.893000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:15:12.893000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:15:12.894000 audit: BPF prog-id=81 op=LOAD Dec 16 12:15:12.894000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:15:12.894000 audit: BPF prog-id=82 op=LOAD Dec 16 12:15:12.894000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:15:12.911528 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:15:12.911601 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:15:12.911000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:15:12.912860 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:15:12.912947 systemd[1]: kubelet.service: Consumed 91ms CPU time, 95.2M memory peak. Dec 16 12:15:12.914824 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:15:13.693848 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:15:13.693000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:15:13.698034 (kubelet)[2504]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:15:13.944710 kubelet[2504]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:15:13.944710 kubelet[2504]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:15:13.944710 kubelet[2504]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:15:13.945035 kubelet[2504]: I1216 12:15:13.944785 2504 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:15:15.189338 kubelet[2504]: I1216 12:15:15.189287 2504 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 16 12:15:15.189338 kubelet[2504]: I1216 12:15:15.189324 2504 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:15:15.189669 kubelet[2504]: I1216 12:15:15.189595 2504 server.go:954] "Client rotation is on, will bootstrap in background" Dec 16 12:15:15.219276 kubelet[2504]: E1216 12:15:15.219226 2504 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://10.0.21.180:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.21.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:15:15.220829 kubelet[2504]: I1216 
12:15:15.220739 2504 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:15:15.227228 kubelet[2504]: I1216 12:15:15.227204 2504 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:15:15.230779 kubelet[2504]: I1216 12:15:15.230752 2504 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 12:15:15.231631 kubelet[2504]: I1216 12:15:15.231567 2504 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:15:15.231780 kubelet[2504]: I1216 12:15:15.231612 2504 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-0-5b424f63c8","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"Topolo
gyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:15:15.231886 kubelet[2504]: I1216 12:15:15.231871 2504 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:15:15.231886 kubelet[2504]: I1216 12:15:15.231882 2504 container_manager_linux.go:304] "Creating device plugin manager" Dec 16 12:15:15.232123 kubelet[2504]: I1216 12:15:15.232108 2504 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:15:15.235370 kubelet[2504]: I1216 12:15:15.235337 2504 kubelet.go:446] "Attempting to sync node with API server" Dec 16 12:15:15.235370 kubelet[2504]: I1216 12:15:15.235370 2504 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:15:15.235470 kubelet[2504]: I1216 12:15:15.235394 2504 kubelet.go:352] "Adding apiserver pod source" Dec 16 12:15:15.235470 kubelet[2504]: I1216 12:15:15.235404 2504 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:15:15.239052 kubelet[2504]: I1216 12:15:15.239018 2504 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:15:15.239657 kubelet[2504]: W1216 12:15:15.239564 2504 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://10.0.21.180:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 10.0.21.180:6443: connect: connection refused Dec 16 12:15:15.239657 kubelet[2504]: E1216 12:15:15.239632 2504 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get 
\"https://10.0.21.180:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.21.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:15:15.239758 kubelet[2504]: W1216 12:15:15.239639 2504 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://10.0.21.180:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-0-5b424f63c8&limit=500&resourceVersion=0": dial tcp 10.0.21.180:6443: connect: connection refused Dec 16 12:15:15.239758 kubelet[2504]: I1216 12:15:15.239682 2504 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 12:15:15.239758 kubelet[2504]: E1216 12:15:15.239693 2504 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://10.0.21.180:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-0-5b424f63c8&limit=500&resourceVersion=0\": dial tcp 10.0.21.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:15:15.239911 kubelet[2504]: W1216 12:15:15.239814 2504 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. 
Dec 16 12:15:15.240860 kubelet[2504]: I1216 12:15:15.240829 2504 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 12:15:15.240927 kubelet[2504]: I1216 12:15:15.240869 2504 server.go:1287] "Started kubelet" Dec 16 12:15:15.242116 kubelet[2504]: I1216 12:15:15.242089 2504 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:15:15.242989 kubelet[2504]: I1216 12:15:15.242585 2504 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:15:15.243630 kubelet[2504]: I1216 12:15:15.243535 2504 server.go:479] "Adding debug handlers to kubelet server" Dec 16 12:15:15.244000 audit[2517]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2517 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.244000 audit[2517]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffd4ca60e0 a2=0 a3=0 items=0 ppid=2504 pid=2517 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.244000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:15:15.245579 kubelet[2504]: I1216 12:15:15.245279 2504 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:15:15.245579 kubelet[2504]: I1216 12:15:15.245554 2504 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:15:15.245794 kubelet[2504]: I1216 12:15:15.245765 2504 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:15:15.245896 kubelet[2504]: E1216 12:15:15.245619 2504 event.go:368] "Unable to write event (may retry after sleeping)" err="Post 
\"https://10.0.21.180:6443/api/v1/namespaces/default/events\": dial tcp 10.0.21.180:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-0-5b424f63c8.1881b1283a4c8626 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-0-5b424f63c8,UID:ci-4547-0-0-0-5b424f63c8,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-0-5b424f63c8,},FirstTimestamp:2025-12-16 12:15:15.240846886 +0000 UTC m=+1.539815723,LastTimestamp:2025-12-16 12:15:15.240846886 +0000 UTC m=+1.539815723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-0-5b424f63c8,}" Dec 16 12:15:15.245000 audit[2518]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2518 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.245000 audit[2518]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe4c15420 a2=0 a3=0 items=0 ppid=2504 pid=2518 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.245000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:15:15.247774 kubelet[2504]: E1216 12:15:15.247750 2504 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" Dec 16 12:15:15.247819 kubelet[2504]: I1216 12:15:15.247793 2504 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 12:15:15.247960 kubelet[2504]: I1216 12:15:15.247942 2504 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 12:15:15.248021 kubelet[2504]: I1216 12:15:15.248002 2504 
reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:15:15.248837 kubelet[2504]: E1216 12:15:15.248792 2504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-0-5b424f63c8?timeout=10s\": dial tcp 10.0.21.180:6443: connect: connection refused" interval="200ms" Dec 16 12:15:15.249780 kubelet[2504]: E1216 12:15:15.249758 2504 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:15:15.249000 audit[2520]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2520 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.249000 audit[2520]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffea7fe8a0 a2=0 a3=0 items=0 ppid=2504 pid=2520 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.249000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:15:15.250685 kubelet[2504]: I1216 12:15:15.250661 2504 factory.go:221] Registration of the containerd container factory successfully Dec 16 12:15:15.250685 kubelet[2504]: I1216 12:15:15.250676 2504 factory.go:221] Registration of the systemd container factory successfully Dec 16 12:15:15.251158 kubelet[2504]: I1216 12:15:15.250751 2504 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:15:15.251344 kubelet[2504]: W1216 12:15:15.251285 2504 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get 
"https://10.0.21.180:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 10.0.21.180:6443: connect: connection refused Dec 16 12:15:15.251381 kubelet[2504]: E1216 12:15:15.251348 2504 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://10.0.21.180:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.21.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:15:15.251000 audit[2522]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2522 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.251000 audit[2522]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=fffffd682630 a2=0 a3=0 items=0 ppid=2504 pid=2522 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.251000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:15:15.257000 audit[2527]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2527 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.257000 audit[2527]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffde0e9920 a2=0 a3=0 items=0 ppid=2504 pid=2527 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.257000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Dec 16 12:15:15.259491 kubelet[2504]: I1216 12:15:15.259427 2504 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Dec 16 12:15:15.259000 audit[2528]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2528 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:15.259000 audit[2528]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc6078b50 a2=0 a3=0 items=0 ppid=2504 pid=2528 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.259000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:15:15.259000 audit[2529]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2529 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.259000 audit[2529]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff3835980 a2=0 a3=0 items=0 ppid=2504 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.259000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:15:15.261509 kubelet[2504]: I1216 12:15:15.261474 2504 kubelet_network_linux.go:50] "Initialized iptables rules." 
protocol="IPv6" Dec 16 12:15:15.261509 kubelet[2504]: I1216 12:15:15.261519 2504 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 16 12:15:15.261509 kubelet[2504]: I1216 12:15:15.261542 2504 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:15:15.261509 kubelet[2504]: I1216 12:15:15.261550 2504 kubelet.go:2382] "Starting kubelet main sync loop" Dec 16 12:15:15.261694 kubelet[2504]: E1216 12:15:15.261594 2504 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:15:15.261000 audit[2533]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2533 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.261000 audit[2533]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffaeee110 a2=0 a3=0 items=0 ppid=2504 pid=2533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.261000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:15:15.261000 audit[2534]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2534 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:15.261000 audit[2534]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffebd27cd0 a2=0 a3=0 items=0 ppid=2504 pid=2534 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.261000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:15:15.263298 kubelet[2504]: W1216 12:15:15.263267 2504 reflector.go:569] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://10.0.21.180:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 10.0.21.180:6443: connect: connection refused Dec 16 12:15:15.263417 kubelet[2504]: E1216 12:15:15.263393 2504 reflector.go:166] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://10.0.21.180:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.21.180:6443: connect: connection refused" logger="UnhandledError" Dec 16 12:15:15.263000 audit[2535]: NETFILTER_CFG table=nat:51 family=10 entries=1 op=nft_register_chain pid=2535 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:15.263000 audit[2535]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe303a660 a2=0 a3=0 items=0 ppid=2504 pid=2535 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.263000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:15:15.265027 kubelet[2504]: I1216 12:15:15.264998 2504 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:15:15.265502 kubelet[2504]: I1216 12:15:15.265133 2504 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:15:15.265502 kubelet[2504]: I1216 12:15:15.265155 2504 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:15:15.264000 audit[2537]: NETFILTER_CFG table=filter:52 family=10 entries=1 op=nft_register_chain pid=2537 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:15.264000 audit[2537]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc6092620 a2=0 a3=0 items=0 ppid=2504 pid=2537 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.264000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:15:15.264000 audit[2536]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2536 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:15.264000 audit[2536]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd2babd20 a2=0 a3=0 items=0 ppid=2504 pid=2536 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.264000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:15:15.267892 kubelet[2504]: I1216 12:15:15.267865 2504 policy_none.go:49] "None policy: Start" Dec 16 12:15:15.267974 kubelet[2504]: I1216 12:15:15.267964 2504 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 12:15:15.268046 kubelet[2504]: I1216 12:15:15.268037 2504 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:15:15.273366 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 12:15:15.296401 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:15:15.318365 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. 
Dec 16 12:15:15.319666 kubelet[2504]: I1216 12:15:15.319644 2504 manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 12:15:15.319853 kubelet[2504]: I1216 12:15:15.319837 2504 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:15:15.319889 kubelet[2504]: I1216 12:15:15.319854 2504 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:15:15.320218 kubelet[2504]: I1216 12:15:15.320201 2504 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:15:15.321566 kubelet[2504]: E1216 12:15:15.321535 2504 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:15:15.321614 kubelet[2504]: E1216 12:15:15.321589 2504 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-0-5b424f63c8\" not found" Dec 16 12:15:15.371848 systemd[1]: Created slice kubepods-burstable-podf06bec92f4e666224e51bdee5ebb29cd.slice - libcontainer container kubepods-burstable-podf06bec92f4e666224e51bdee5ebb29cd.slice. Dec 16 12:15:15.396225 kubelet[2504]: E1216 12:15:15.396187 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.400466 systemd[1]: Created slice kubepods-burstable-podc916ee27207567fde06f7b695280ebf9.slice - libcontainer container kubepods-burstable-podc916ee27207567fde06f7b695280ebf9.slice. 
Dec 16 12:15:15.402314 kubelet[2504]: E1216 12:15:15.402283 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.404360 systemd[1]: Created slice kubepods-burstable-pod2d1d31f11f480b9db1ea84eba4a0cccf.slice - libcontainer container kubepods-burstable-pod2d1d31f11f480b9db1ea84eba4a0cccf.slice. Dec 16 12:15:15.406279 kubelet[2504]: E1216 12:15:15.406251 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.421777 kubelet[2504]: I1216 12:15:15.421754 2504 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.422192 kubelet[2504]: E1216 12:15:15.422168 2504 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.180:6443/api/v1/nodes\": dial tcp 10.0.21.180:6443: connect: connection refused" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450309 kubelet[2504]: I1216 12:15:15.449592 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f06bec92f4e666224e51bdee5ebb29cd-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" (UID: \"f06bec92f4e666224e51bdee5ebb29cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450309 kubelet[2504]: I1216 12:15:15.449628 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 
12:15:15.450309 kubelet[2504]: I1216 12:15:15.449650 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450309 kubelet[2504]: I1216 12:15:15.449665 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450309 kubelet[2504]: I1216 12:15:15.449681 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450501 kubelet[2504]: I1216 12:15:15.449694 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f06bec92f4e666224e51bdee5ebb29cd-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" (UID: \"f06bec92f4e666224e51bdee5ebb29cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450501 kubelet[2504]: I1216 12:15:15.449710 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f06bec92f4e666224e51bdee5ebb29cd-k8s-certs\") pod 
\"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" (UID: \"f06bec92f4e666224e51bdee5ebb29cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450501 kubelet[2504]: I1216 12:15:15.449725 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450501 kubelet[2504]: I1216 12:15:15.449742 2504 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2d1d31f11f480b9db1ea84eba4a0cccf-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-0-5b424f63c8\" (UID: \"2d1d31f11f480b9db1ea84eba4a0cccf\") " pod="kube-system/kube-scheduler-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.450501 kubelet[2504]: E1216 12:15:15.450388 2504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-0-5b424f63c8?timeout=10s\": dial tcp 10.0.21.180:6443: connect: connection refused" interval="400ms" Dec 16 12:15:15.624169 kubelet[2504]: I1216 12:15:15.624142 2504 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.624492 kubelet[2504]: E1216 12:15:15.624467 2504 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.180:6443/api/v1/nodes\": dial tcp 10.0.21.180:6443: connect: connection refused" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:15.698244 containerd[1680]: time="2025-12-16T12:15:15.698203859Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-0-5b424f63c8,Uid:f06bec92f4e666224e51bdee5ebb29cd,Namespace:kube-system,Attempt:0,}" Dec 16 12:15:15.704139 containerd[1680]: time="2025-12-16T12:15:15.703971848Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-0-5b424f63c8,Uid:c916ee27207567fde06f7b695280ebf9,Namespace:kube-system,Attempt:0,}" Dec 16 12:15:15.707066 containerd[1680]: time="2025-12-16T12:15:15.707029864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-0-5b424f63c8,Uid:2d1d31f11f480b9db1ea84eba4a0cccf,Namespace:kube-system,Attempt:0,}" Dec 16 12:15:15.725953 containerd[1680]: time="2025-12-16T12:15:15.725905040Z" level=info msg="connecting to shim 089c9d8a89d21b1e42db10fdbf486b800a815b62404b451ee4f2f11387a385e4" address="unix:///run/containerd/s/446b11544b4a2e542ee744672b3bcd40e574d4f94b669f957c8cf70822dd8068" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:15:15.745973 containerd[1680]: time="2025-12-16T12:15:15.745933542Z" level=info msg="connecting to shim b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd" address="unix:///run/containerd/s/6709cd05166b606da57e33e60d9bee3af2a77f614e3426936b371827527b26fa" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:15:15.753866 containerd[1680]: time="2025-12-16T12:15:15.753792222Z" level=info msg="connecting to shim 2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450" address="unix:///run/containerd/s/ef72796d5aeaa955ed3915aefdb00d0b2ae28953dbc3e58e0dae29faf0751789" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:15:15.757348 systemd[1]: Started cri-containerd-089c9d8a89d21b1e42db10fdbf486b800a815b62404b451ee4f2f11387a385e4.scope - libcontainer container 089c9d8a89d21b1e42db10fdbf486b800a815b62404b451ee4f2f11387a385e4. 
Dec 16 12:15:15.778597 systemd[1]: Started cri-containerd-b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd.scope - libcontainer container b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd. Dec 16 12:15:15.780000 audit: BPF prog-id=83 op=LOAD Dec 16 12:15:15.781000 audit: BPF prog-id=84 op=LOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.781000 audit: BPF prog-id=84 op=UNLOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.781000 audit: BPF prog-id=85 op=LOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.781000 audit: BPF prog-id=86 op=LOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.781000 audit: BPF prog-id=86 op=UNLOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.781000 audit: BPF prog-id=85 op=UNLOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.781000 audit: BPF prog-id=87 op=LOAD Dec 16 12:15:15.781000 audit[2558]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2547 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.781000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038396339643861383964323162316534326462313066646266343836 Dec 16 12:15:15.783691 systemd[1]: Started cri-containerd-2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450.scope - libcontainer container 2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450. 
Dec 16 12:15:15.792000 audit: BPF prog-id=88 op=LOAD Dec 16 12:15:15.793000 audit: BPF prog-id=89 op=LOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.793000 audit: BPF prog-id=89 op=UNLOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.793000 audit: BPF prog-id=90 op=LOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.793000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.793000 audit: BPF prog-id=91 op=LOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.793000 audit: BPF prog-id=91 op=UNLOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.793000 audit: BPF prog-id=90 op=UNLOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:15:15.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.793000 audit: BPF prog-id=92 op=LOAD Dec 16 12:15:15.793000 audit[2603]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2577 pid=2603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.793000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6237353034316438623861656236343934303632643962363031303233 Dec 16 12:15:15.795000 audit: BPF prog-id=93 op=LOAD Dec 16 12:15:15.795000 audit: BPF prog-id=94 op=LOAD Dec 16 12:15:15.795000 audit[2617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.795000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.796000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:15:15.796000 audit[2617]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.796000 audit: BPF prog-id=95 op=LOAD Dec 16 12:15:15.796000 audit[2617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.796000 audit: BPF prog-id=96 op=LOAD Dec 16 12:15:15.796000 audit[2617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.796000 audit: BPF prog-id=96 op=UNLOAD Dec 16 12:15:15.796000 audit[2617]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.796000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.797000 audit: BPF prog-id=95 op=UNLOAD Dec 16 12:15:15.797000 audit[2617]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.797000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.797000 audit: BPF prog-id=97 op=LOAD Dec 16 12:15:15.797000 audit[2617]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2596 pid=2617 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.797000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3262656231323836623831396163663761383238643233393633363231 Dec 16 12:15:15.816385 containerd[1680]: time="2025-12-16T12:15:15.816323021Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-0-5b424f63c8,Uid:f06bec92f4e666224e51bdee5ebb29cd,Namespace:kube-system,Attempt:0,} returns sandbox id \"089c9d8a89d21b1e42db10fdbf486b800a815b62404b451ee4f2f11387a385e4\"" Dec 16 12:15:15.820667 containerd[1680]: time="2025-12-16T12:15:15.820624763Z" level=info msg="CreateContainer within sandbox \"089c9d8a89d21b1e42db10fdbf486b800a815b62404b451ee4f2f11387a385e4\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:15:15.829097 containerd[1680]: time="2025-12-16T12:15:15.828873165Z" level=info msg="Container 2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:15.829667 containerd[1680]: time="2025-12-16T12:15:15.829619009Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-0-5b424f63c8,Uid:2d1d31f11f480b9db1ea84eba4a0cccf,Namespace:kube-system,Attempt:0,} returns sandbox id \"2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450\"" Dec 16 12:15:15.832523 containerd[1680]: time="2025-12-16T12:15:15.832485864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-0-5b424f63c8,Uid:c916ee27207567fde06f7b695280ebf9,Namespace:kube-system,Attempt:0,} returns sandbox id \"b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd\"" Dec 16 12:15:15.832523 containerd[1680]: time="2025-12-16T12:15:15.832510104Z" level=info msg="CreateContainer within sandbox \"2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:15:15.834528 containerd[1680]: time="2025-12-16T12:15:15.834245593Z" level=info msg="CreateContainer within sandbox \"b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:15:15.848812 containerd[1680]: time="2025-12-16T12:15:15.848766787Z" 
level=info msg="Container 38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:15.851059 kubelet[2504]: E1216 12:15:15.851027 2504 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-0-5b424f63c8?timeout=10s\": dial tcp 10.0.21.180:6443: connect: connection refused" interval="800ms" Dec 16 12:15:15.857778 containerd[1680]: time="2025-12-16T12:15:15.857734952Z" level=info msg="Container 8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:15.860743 containerd[1680]: time="2025-12-16T12:15:15.860704808Z" level=info msg="CreateContainer within sandbox \"089c9d8a89d21b1e42db10fdbf486b800a815b62404b451ee4f2f11387a385e4\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea\"" Dec 16 12:15:15.862092 containerd[1680]: time="2025-12-16T12:15:15.861496052Z" level=info msg="StartContainer for \"2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea\"" Dec 16 12:15:15.862642 containerd[1680]: time="2025-12-16T12:15:15.862619617Z" level=info msg="connecting to shim 2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea" address="unix:///run/containerd/s/446b11544b4a2e542ee744672b3bcd40e574d4f94b669f957c8cf70822dd8068" protocol=ttrpc version=3 Dec 16 12:15:15.871842 containerd[1680]: time="2025-12-16T12:15:15.871785144Z" level=info msg="CreateContainer within sandbox \"2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d\"" Dec 16 12:15:15.872317 containerd[1680]: time="2025-12-16T12:15:15.872289387Z" level=info msg="StartContainer for 
\"38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d\"" Dec 16 12:15:15.872435 containerd[1680]: time="2025-12-16T12:15:15.872293907Z" level=info msg="CreateContainer within sandbox \"b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a\"" Dec 16 12:15:15.872800 containerd[1680]: time="2025-12-16T12:15:15.872767909Z" level=info msg="StartContainer for \"8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a\"" Dec 16 12:15:15.874047 containerd[1680]: time="2025-12-16T12:15:15.873842315Z" level=info msg="connecting to shim 8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a" address="unix:///run/containerd/s/6709cd05166b606da57e33e60d9bee3af2a77f614e3426936b371827527b26fa" protocol=ttrpc version=3 Dec 16 12:15:15.874904 containerd[1680]: time="2025-12-16T12:15:15.874873840Z" level=info msg="connecting to shim 38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d" address="unix:///run/containerd/s/ef72796d5aeaa955ed3915aefdb00d0b2ae28953dbc3e58e0dae29faf0751789" protocol=ttrpc version=3 Dec 16 12:15:15.885349 systemd[1]: Started cri-containerd-2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea.scope - libcontainer container 2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea. Dec 16 12:15:15.908531 systemd[1]: Started cri-containerd-38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d.scope - libcontainer container 38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d. Dec 16 12:15:15.911380 systemd[1]: Started cri-containerd-8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a.scope - libcontainer container 8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a. 
Dec 16 12:15:15.913000 audit: BPF prog-id=98 op=LOAD Dec 16 12:15:15.915000 audit: BPF prog-id=99 op=LOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.915000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.915000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.915000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.915000 audit: BPF prog-id=100 op=LOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.915000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.915000 audit: BPF prog-id=101 op=LOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.915000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.915000 audit: BPF prog-id=101 op=UNLOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.915000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.915000 audit: BPF prog-id=100 op=UNLOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:15:15.915000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.915000 audit: BPF prog-id=102 op=LOAD Dec 16 12:15:15.915000 audit[2677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2547 pid=2677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.915000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237323064376436343563653535396337306463666364636636663231 Dec 16 12:15:15.921000 audit: BPF prog-id=103 op=LOAD Dec 16 12:15:15.923000 audit: BPF prog-id=104 op=LOAD Dec 16 12:15:15.923000 audit[2690]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.923000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.923000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:15:15.923000 audit[2690]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.923000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.923000 audit: BPF prog-id=105 op=LOAD Dec 16 12:15:15.923000 audit[2690]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.923000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.923000 audit: BPF prog-id=106 op=LOAD Dec 16 12:15:15.923000 audit[2690]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.923000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.924000 audit: BPF prog-id=106 op=UNLOAD Dec 16 12:15:15.924000 audit[2690]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.924000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.924000 audit: BPF prog-id=105 op=UNLOAD Dec 16 12:15:15.924000 audit[2690]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.924000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.924000 audit: BPF prog-id=107 op=LOAD Dec 16 12:15:15.924000 audit[2690]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2596 pid=2690 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.924000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3338623130326437343263623832663561613232303362623436306639 Dec 16 12:15:15.926000 audit: BPF prog-id=108 op=LOAD Dec 16 12:15:15.927000 audit: BPF prog-id=109 op=LOAD Dec 16 12:15:15.927000 audit[2689]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.927000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:15:15.927000 audit[2689]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.927000 audit: BPF prog-id=110 op=LOAD Dec 16 12:15:15.927000 audit[2689]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.927000 audit: BPF prog-id=111 op=LOAD Dec 16 12:15:15.927000 audit[2689]: 
SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.927000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.928000 audit: BPF prog-id=111 op=UNLOAD Dec 16 12:15:15.928000 audit[2689]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.928000 audit: BPF prog-id=110 op=UNLOAD Dec 16 12:15:15.928000 audit[2689]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.928000 audit: BPF prog-id=112 op=LOAD 
Dec 16 12:15:15.928000 audit[2689]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2577 pid=2689 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:15.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863396237626565316634636438313236346633376362626363623038 Dec 16 12:15:15.966386 containerd[1680]: time="2025-12-16T12:15:15.966285746Z" level=info msg="StartContainer for \"2720d7d645ce559c70dcfcdcf6f21b603fd75f7925508c48aef2ea359595a8ea\" returns successfully" Dec 16 12:15:15.966749 containerd[1680]: time="2025-12-16T12:15:15.966334626Z" level=info msg="StartContainer for \"38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d\" returns successfully" Dec 16 12:15:15.971102 containerd[1680]: time="2025-12-16T12:15:15.971068330Z" level=info msg="StartContainer for \"8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a\" returns successfully" Dec 16 12:15:16.027545 kubelet[2504]: I1216 12:15:16.027504 2504 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:16.028162 kubelet[2504]: E1216 12:15:16.028127 2504 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.180:6443/api/v1/nodes\": dial tcp 10.0.21.180:6443: connect: connection refused" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:16.271435 kubelet[2504]: E1216 12:15:16.271343 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:16.273605 kubelet[2504]: E1216 12:15:16.273514 2504 
kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:16.275531 kubelet[2504]: E1216 12:15:16.275512 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:16.830394 kubelet[2504]: I1216 12:15:16.830345 2504 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:17.281724 kubelet[2504]: E1216 12:15:17.281534 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:17.281724 kubelet[2504]: E1216 12:15:17.281630 2504 kubelet.go:3190] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:17.548230 update_engine[1647]: I20251216 12:15:17.548100 1647 update_attempter.cc:509] Updating boot flags... 
Dec 16 12:15:17.929261 kubelet[2504]: E1216 12:15:17.929226 2504 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-0-5b424f63c8\" not found" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:17.996135 kubelet[2504]: I1216 12:15:17.996068 2504 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.037483 kubelet[2504]: E1216 12:15:18.037374 2504 event.go:359] "Server rejected event (will not retry!)" err="namespaces \"default\" not found" event="&Event{ObjectMeta:{ci-4547-0-0-0-5b424f63c8.1881b1283a4c8626 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-0-5b424f63c8,UID:ci-4547-0-0-0-5b424f63c8,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-0-5b424f63c8,},FirstTimestamp:2025-12-16 12:15:15.240846886 +0000 UTC m=+1.539815723,LastTimestamp:2025-12-16 12:15:15.240846886 +0000 UTC m=+1.539815723,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-0-5b424f63c8,}" Dec 16 12:15:18.049004 kubelet[2504]: I1216 12:15:18.048963 2504 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.066777 kubelet[2504]: E1216 12:15:18.066741 2504 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.066777 kubelet[2504]: I1216 12:15:18.066775 2504 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.073976 kubelet[2504]: E1216 12:15:18.073945 2504 kubelet.go:3196] 
"Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.074015 kubelet[2504]: I1216 12:15:18.073980 2504 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.080078 kubelet[2504]: E1216 12:15:18.079265 2504 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-0-5b424f63c8\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.237614 kubelet[2504]: I1216 12:15:18.237497 2504 apiserver.go:52] "Watching apiserver" Dec 16 12:15:18.248189 kubelet[2504]: I1216 12:15:18.248138 2504 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 12:15:18.281377 kubelet[2504]: I1216 12:15:18.281350 2504 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:18.283493 kubelet[2504]: E1216 12:15:18.283461 2504 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:19.884936 systemd[1]: Reload requested from client PID 2793 ('systemctl') (unit session-8.scope)... Dec 16 12:15:19.884954 systemd[1]: Reloading... Dec 16 12:15:19.958106 zram_generator::config[2839]: No configuration found. Dec 16 12:15:20.165052 systemd[1]: Reloading finished in 279 ms. Dec 16 12:15:20.191636 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:15:20.212395 systemd[1]: kubelet.service: Deactivated successfully. 
Dec 16 12:15:20.212675 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:15:20.211000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:20.212746 systemd[1]: kubelet.service: Consumed 1.758s CPU time, 128.9M memory peak. Dec 16 12:15:20.214136 kernel: kauditd_printk_skb: 202 callbacks suppressed Dec 16 12:15:20.214195 kernel: audit: type=1131 audit(1765887320.211:398): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:20.214900 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:15:20.214000 audit: BPF prog-id=113 op=LOAD Dec 16 12:15:20.216450 kernel: audit: type=1334 audit(1765887320.214:399): prog-id=113 op=LOAD Dec 16 12:15:20.216500 kernel: audit: type=1334 audit(1765887320.214:400): prog-id=78 op=UNLOAD Dec 16 12:15:20.214000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:15:20.217112 kernel: audit: type=1334 audit(1765887320.215:401): prog-id=114 op=LOAD Dec 16 12:15:20.215000 audit: BPF prog-id=114 op=LOAD Dec 16 12:15:20.216000 audit: BPF prog-id=115 op=LOAD Dec 16 12:15:20.218390 kernel: audit: type=1334 audit(1765887320.216:402): prog-id=115 op=LOAD Dec 16 12:15:20.218436 kernel: audit: type=1334 audit(1765887320.216:403): prog-id=79 op=UNLOAD Dec 16 12:15:20.218455 kernel: audit: type=1334 audit(1765887320.216:404): prog-id=80 op=UNLOAD Dec 16 12:15:20.216000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:15:20.216000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:15:20.219095 kernel: audit: type=1334 audit(1765887320.217:405): prog-id=116 op=LOAD Dec 16 12:15:20.217000 audit: BPF prog-id=116 op=LOAD Dec 16 12:15:20.218000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:15:20.219000 audit: BPF 
prog-id=117 op=LOAD Dec 16 12:15:20.221744 kernel: audit: type=1334 audit(1765887320.218:406): prog-id=74 op=UNLOAD Dec 16 12:15:20.221780 kernel: audit: type=1334 audit(1765887320.219:407): prog-id=117 op=LOAD Dec 16 12:15:20.237000 audit: BPF prog-id=118 op=LOAD Dec 16 12:15:20.237000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:15:20.237000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:15:20.237000 audit: BPF prog-id=119 op=LOAD Dec 16 12:15:20.238000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:15:20.238000 audit: BPF prog-id=120 op=LOAD Dec 16 12:15:20.238000 audit: BPF prog-id=121 op=LOAD Dec 16 12:15:20.238000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:15:20.238000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:15:20.238000 audit: BPF prog-id=122 op=LOAD Dec 16 12:15:20.238000 audit: BPF prog-id=81 op=UNLOAD Dec 16 12:15:20.239000 audit: BPF prog-id=123 op=LOAD Dec 16 12:15:20.239000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:15:20.240000 audit: BPF prog-id=124 op=LOAD Dec 16 12:15:20.240000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:15:20.240000 audit: BPF prog-id=125 op=LOAD Dec 16 12:15:20.240000 audit: BPF prog-id=126 op=LOAD Dec 16 12:15:20.240000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:15:20.240000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:15:20.241000 audit: BPF prog-id=127 op=LOAD Dec 16 12:15:20.241000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:15:20.241000 audit: BPF prog-id=128 op=LOAD Dec 16 12:15:20.241000 audit: BPF prog-id=129 op=LOAD Dec 16 12:15:20.241000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:15:20.241000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:15:20.242000 audit: BPF prog-id=130 op=LOAD Dec 16 12:15:20.242000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:15:20.242000 audit: BPF prog-id=131 op=LOAD Dec 16 12:15:20.242000 audit: BPF prog-id=132 op=LOAD Dec 16 12:15:20.242000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:15:20.242000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:15:20.371276 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:15:20.370000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:20.377471 (kubelet)[2884]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:15:20.586166 kubelet[2884]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:15:20.586166 kubelet[2884]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:15:20.586166 kubelet[2884]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:15:20.586166 kubelet[2884]: I1216 12:15:20.586017 2884 server.go:215] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:15:20.594702 kubelet[2884]: I1216 12:15:20.594663 2884 server.go:520] "Kubelet version" kubeletVersion="v1.32.4" Dec 16 12:15:20.594702 kubelet[2884]: I1216 12:15:20.594694 2884 server.go:522] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:15:20.595139 kubelet[2884]: I1216 12:15:20.595120 2884 server.go:954] "Client rotation is on, will bootstrap in background" Dec 16 12:15:20.597207 kubelet[2884]: I1216 12:15:20.597184 2884 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Dec 16 12:15:20.599843 kubelet[2884]: I1216 12:15:20.599726 2884 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:15:20.603401 kubelet[2884]: I1216 12:15:20.603383 2884 server.go:1444] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:15:20.605977 kubelet[2884]: I1216 12:15:20.605952 2884 server.go:772] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 12:15:20.606277 kubelet[2884]: I1216 12:15:20.606244 2884 container_manager_linux.go:268] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:15:20.606509 kubelet[2884]: I1216 12:15:20.606345 2884 container_manager_linux.go:273] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-0-5b424f63c8","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"
none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:15:20.606643 kubelet[2884]: I1216 12:15:20.606629 2884 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:15:20.606698 kubelet[2884]: I1216 12:15:20.606689 2884 container_manager_linux.go:304] "Creating device plugin manager" Dec 16 12:15:20.606791 kubelet[2884]: I1216 12:15:20.606781 2884 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:15:20.607014 kubelet[2884]: I1216 12:15:20.606999 2884 kubelet.go:446] "Attempting to sync node with API server" Dec 16 12:15:20.607111 kubelet[2884]: I1216 12:15:20.607094 2884 kubelet.go:341] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:15:20.607197 kubelet[2884]: I1216 12:15:20.607185 2884 kubelet.go:352] "Adding apiserver pod source" Dec 16 12:15:20.607323 kubelet[2884]: I1216 12:15:20.607313 2884 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:15:20.608181 kubelet[2884]: I1216 12:15:20.608138 2884 kuberuntime_manager.go:269] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:15:20.609128 kubelet[2884]: I1216 12:15:20.608592 2884 kubelet.go:890] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Dec 16 12:15:20.609366 kubelet[2884]: I1216 12:15:20.609342 2884 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 12:15:20.609913 kubelet[2884]: I1216 12:15:20.609899 2884 server.go:1287] "Started kubelet" Dec 16 12:15:20.612126 kubelet[2884]: I1216 12:15:20.612012 2884 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 
12:15:20.612396 kubelet[2884]: I1216 12:15:20.612363 2884 server.go:243] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:15:20.613278 kubelet[2884]: I1216 12:15:20.613237 2884 server.go:169] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:15:20.615364 kubelet[2884]: I1216 12:15:20.615325 2884 server.go:479] "Adding debug handlers to kubelet server" Dec 16 12:15:20.621967 kubelet[2884]: I1216 12:15:20.621923 2884 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:15:20.626126 kubelet[2884]: I1216 12:15:20.626086 2884 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:15:20.627835 kubelet[2884]: E1216 12:15:20.627794 2884 kubelet.go:1555] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:15:20.631702 kubelet[2884]: I1216 12:15:20.631665 2884 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 12:15:20.631800 kubelet[2884]: I1216 12:15:20.631780 2884 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 12:15:20.631920 kubelet[2884]: I1216 12:15:20.631900 2884 reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:15:20.632249 kubelet[2884]: I1216 12:15:20.632224 2884 factory.go:221] Registration of the systemd container factory successfully Dec 16 12:15:20.632357 kubelet[2884]: I1216 12:15:20.632330 2884 factory.go:219] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:15:20.636195 kubelet[2884]: I1216 12:15:20.636168 2884 factory.go:221] Registration of the containerd container factory successfully Dec 16 12:15:20.646365 kubelet[2884]: I1216 12:15:20.646314 2884 kubelet_network_linux.go:50] 
"Initialized iptables rules." protocol="IPv4" Dec 16 12:15:20.647735 kubelet[2884]: I1216 12:15:20.647712 2884 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Dec 16 12:15:20.647735 kubelet[2884]: I1216 12:15:20.647736 2884 status_manager.go:227] "Starting to sync pod status with apiserver" Dec 16 12:15:20.647868 kubelet[2884]: I1216 12:15:20.647754 2884 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:15:20.647868 kubelet[2884]: I1216 12:15:20.647762 2884 kubelet.go:2382] "Starting kubelet main sync loop" Dec 16 12:15:20.647868 kubelet[2884]: E1216 12:15:20.647805 2884 kubelet.go:2406] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:15:20.671844 kubelet[2884]: I1216 12:15:20.671816 2884 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:15:20.671844 kubelet[2884]: I1216 12:15:20.671836 2884 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:15:20.671968 kubelet[2884]: I1216 12:15:20.671875 2884 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:15:20.672036 kubelet[2884]: I1216 12:15:20.672018 2884 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:15:20.672064 kubelet[2884]: I1216 12:15:20.672034 2884 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:15:20.672064 kubelet[2884]: I1216 12:15:20.672052 2884 policy_none.go:49] "None policy: Start" Dec 16 12:15:20.672140 kubelet[2884]: I1216 12:15:20.672066 2884 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 12:15:20.672140 kubelet[2884]: I1216 12:15:20.672086 2884 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:15:20.672202 kubelet[2884]: I1216 12:15:20.672187 2884 state_mem.go:75] "Updated machine memory state" Dec 16 12:15:20.675911 kubelet[2884]: I1216 12:15:20.675835 2884 
manager.go:519] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Dec 16 12:15:20.676619 kubelet[2884]: I1216 12:15:20.676604 2884 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:15:20.676668 kubelet[2884]: I1216 12:15:20.676621 2884 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:15:20.677091 kubelet[2884]: I1216 12:15:20.676895 2884 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:15:20.677781 kubelet[2884]: E1216 12:15:20.677732 2884 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:15:20.748797 kubelet[2884]: I1216 12:15:20.748634 2884 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.748797 kubelet[2884]: I1216 12:15:20.748648 2884 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.748797 kubelet[2884]: I1216 12:15:20.748653 2884 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.782379 kubelet[2884]: I1216 12:15:20.782351 2884 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.790506 kubelet[2884]: I1216 12:15:20.790427 2884 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.790812 kubelet[2884]: I1216 12:15:20.790797 2884 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833209 kubelet[2884]: I1216 12:15:20.833114 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: 
\"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833209 kubelet[2884]: I1216 12:15:20.833206 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833386 kubelet[2884]: I1216 12:15:20.833264 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833386 kubelet[2884]: I1216 12:15:20.833307 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f06bec92f4e666224e51bdee5ebb29cd-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" (UID: \"f06bec92f4e666224e51bdee5ebb29cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833386 kubelet[2884]: I1216 12:15:20.833339 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f06bec92f4e666224e51bdee5ebb29cd-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" (UID: \"f06bec92f4e666224e51bdee5ebb29cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 
12:15:20.833386 kubelet[2884]: I1216 12:15:20.833358 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833386 kubelet[2884]: I1216 12:15:20.833375 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/c916ee27207567fde06f7b695280ebf9-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" (UID: \"c916ee27207567fde06f7b695280ebf9\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833521 kubelet[2884]: I1216 12:15:20.833390 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/2d1d31f11f480b9db1ea84eba4a0cccf-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-0-5b424f63c8\" (UID: \"2d1d31f11f480b9db1ea84eba4a0cccf\") " pod="kube-system/kube-scheduler-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:20.833521 kubelet[2884]: I1216 12:15:20.833404 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f06bec92f4e666224e51bdee5ebb29cd-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" (UID: \"f06bec92f4e666224e51bdee5ebb29cd\") " pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:21.608490 kubelet[2884]: I1216 12:15:21.608439 2884 apiserver.go:52] "Watching apiserver" Dec 16 12:15:21.632192 kubelet[2884]: I1216 12:15:21.632160 2884 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 12:15:21.664435 kubelet[2884]: I1216 
12:15:21.664404 2884 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:21.664727 kubelet[2884]: I1216 12:15:21.664706 2884 kubelet.go:3194] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:21.673110 kubelet[2884]: E1216 12:15:21.672152 2884 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-0-5b424f63c8\" already exists" pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:21.675061 kubelet[2884]: E1216 12:15:21.672386 2884 kubelet.go:3196] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-0-5b424f63c8\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" Dec 16 12:15:21.725512 kubelet[2884]: I1216 12:15:21.725393 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-0-5b424f63c8" podStartSLOduration=1.725374917 podStartE2EDuration="1.725374917s" podCreationTimestamp="2025-12-16 12:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:15:21.70823863 +0000 UTC m=+1.327232090" watchObservedRunningTime="2025-12-16 12:15:21.725374917 +0000 UTC m=+1.344368377" Dec 16 12:15:21.725659 kubelet[2884]: I1216 12:15:21.725620 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-0-5b424f63c8" podStartSLOduration=1.725612798 podStartE2EDuration="1.725612798s" podCreationTimestamp="2025-12-16 12:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:15:21.725467358 +0000 UTC m=+1.344460818" watchObservedRunningTime="2025-12-16 12:15:21.725612798 +0000 UTC m=+1.344606258" Dec 16 
12:15:21.744400 kubelet[2884]: I1216 12:15:21.744293 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-0-5b424f63c8" podStartSLOduration=1.744278693 podStartE2EDuration="1.744278693s" podCreationTimestamp="2025-12-16 12:15:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:15:21.733913561 +0000 UTC m=+1.352907021" watchObservedRunningTime="2025-12-16 12:15:21.744278693 +0000 UTC m=+1.363272153" Dec 16 12:15:25.542024 kubelet[2884]: I1216 12:15:25.541986 2884 kuberuntime_manager.go:1702] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:15:25.542793 kubelet[2884]: I1216 12:15:25.542638 2884 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:15:25.542857 containerd[1680]: time="2025-12-16T12:15:25.542460984Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:15:26.522449 systemd[1]: Created slice kubepods-besteffort-pod6ea469c1_4f88_42cf_a785_e011a82d4e27.slice - libcontainer container kubepods-besteffort-pod6ea469c1_4f88_42cf_a785_e011a82d4e27.slice. Dec 16 12:15:26.641004 systemd[1]: Created slice kubepods-besteffort-pod2adc4df3_4f5c_4709_800e_2d961848e0d0.slice - libcontainer container kubepods-besteffort-pod2adc4df3_4f5c_4709_800e_2d961848e0d0.slice. 
Dec 16 12:15:26.667042 kubelet[2884]: I1216 12:15:26.666993 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6ea469c1-4f88-42cf-a785-e011a82d4e27-kube-proxy\") pod \"kube-proxy-fqbtk\" (UID: \"6ea469c1-4f88-42cf-a785-e011a82d4e27\") " pod="kube-system/kube-proxy-fqbtk" Dec 16 12:15:26.667418 kubelet[2884]: I1216 12:15:26.667066 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6ea469c1-4f88-42cf-a785-e011a82d4e27-lib-modules\") pod \"kube-proxy-fqbtk\" (UID: \"6ea469c1-4f88-42cf-a785-e011a82d4e27\") " pod="kube-system/kube-proxy-fqbtk" Dec 16 12:15:26.667418 kubelet[2884]: I1216 12:15:26.667125 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2dj8n\" (UniqueName: \"kubernetes.io/projected/6ea469c1-4f88-42cf-a785-e011a82d4e27-kube-api-access-2dj8n\") pod \"kube-proxy-fqbtk\" (UID: \"6ea469c1-4f88-42cf-a785-e011a82d4e27\") " pod="kube-system/kube-proxy-fqbtk" Dec 16 12:15:26.667418 kubelet[2884]: I1216 12:15:26.667149 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6ea469c1-4f88-42cf-a785-e011a82d4e27-xtables-lock\") pod \"kube-proxy-fqbtk\" (UID: \"6ea469c1-4f88-42cf-a785-e011a82d4e27\") " pod="kube-system/kube-proxy-fqbtk" Dec 16 12:15:26.768467 kubelet[2884]: I1216 12:15:26.768280 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/2adc4df3-4f5c-4709-800e-2d961848e0d0-var-lib-calico\") pod \"tigera-operator-7dcd859c48-8ccrj\" (UID: \"2adc4df3-4f5c-4709-800e-2d961848e0d0\") " pod="tigera-operator/tigera-operator-7dcd859c48-8ccrj" Dec 16 12:15:26.768467 kubelet[2884]: I1216 
12:15:26.768466 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xg4xk\" (UniqueName: \"kubernetes.io/projected/2adc4df3-4f5c-4709-800e-2d961848e0d0-kube-api-access-xg4xk\") pod \"tigera-operator-7dcd859c48-8ccrj\" (UID: \"2adc4df3-4f5c-4709-800e-2d961848e0d0\") " pod="tigera-operator/tigera-operator-7dcd859c48-8ccrj" Dec 16 12:15:26.831777 containerd[1680]: time="2025-12-16T12:15:26.831669079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-fqbtk,Uid:6ea469c1-4f88-42cf-a785-e011a82d4e27,Namespace:kube-system,Attempt:0,}" Dec 16 12:15:26.854662 containerd[1680]: time="2025-12-16T12:15:26.854617636Z" level=info msg="connecting to shim c54ea329a163955226669c3e55645041df19fa176b70d3a8fccb6805d08c1b41" address="unix:///run/containerd/s/283d1e26ad81add0e0f4b5fed594544405336118bfa79aed3bff74479abe9e49" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:15:26.881328 systemd[1]: Started cri-containerd-c54ea329a163955226669c3e55645041df19fa176b70d3a8fccb6805d08c1b41.scope - libcontainer container c54ea329a163955226669c3e55645041df19fa176b70d3a8fccb6805d08c1b41. 
Dec 16 12:15:26.889000 audit: BPF prog-id=133 op=LOAD Dec 16 12:15:26.891284 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:15:26.891328 kernel: audit: type=1334 audit(1765887326.889:440): prog-id=133 op=LOAD Dec 16 12:15:26.890000 audit: BPF prog-id=134 op=LOAD Dec 16 12:15:26.892206 kernel: audit: type=1334 audit(1765887326.890:441): prog-id=134 op=LOAD Dec 16 12:15:26.892262 kernel: audit: type=1300 audit(1765887326.890:441): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.890000 audit[2952]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.897860 kernel: audit: type=1327 audit(1765887326.890:441): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.897956 kernel: audit: type=1334 audit(1765887326.890:442): prog-id=134 op=UNLOAD Dec 16 12:15:26.890000 audit: BPF prog-id=134 op=UNLOAD Dec 16 12:15:26.890000 audit[2952]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2941 pid=2952 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.901401 kernel: audit: type=1300 audit(1765887326.890:442): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.901477 kernel: audit: type=1327 audit(1765887326.890:442): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.890000 audit: BPF prog-id=135 op=LOAD Dec 16 12:15:26.890000 audit[2952]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.907651 kernel: audit: type=1334 audit(1765887326.890:443): prog-id=135 op=LOAD Dec 16 12:15:26.907742 kernel: audit: type=1300 audit(1765887326.890:443): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:15:26.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.911298 kernel: audit: type=1327 audit(1765887326.890:443): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.891000 audit: BPF prog-id=136 op=LOAD Dec 16 12:15:26.891000 audit[2952]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.891000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.893000 audit: BPF prog-id=136 op=UNLOAD Dec 16 12:15:26.893000 audit[2952]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.893000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.894000 audit: BPF prog-id=135 op=UNLOAD Dec 16 12:15:26.894000 audit[2952]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.894000 audit: BPF prog-id=137 op=LOAD Dec 16 12:15:26.894000 audit[2952]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2941 pid=2952 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:26.894000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335346561333239613136333935353232363636396333653535363435 Dec 16 12:15:26.923206 containerd[1680]: time="2025-12-16T12:15:26.923048145Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-fqbtk,Uid:6ea469c1-4f88-42cf-a785-e011a82d4e27,Namespace:kube-system,Attempt:0,} returns sandbox id \"c54ea329a163955226669c3e55645041df19fa176b70d3a8fccb6805d08c1b41\"" Dec 16 12:15:26.927226 containerd[1680]: 
time="2025-12-16T12:15:26.927188086Z" level=info msg="CreateContainer within sandbox \"c54ea329a163955226669c3e55645041df19fa176b70d3a8fccb6805d08c1b41\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:15:26.936569 containerd[1680]: time="2025-12-16T12:15:26.936520654Z" level=info msg="Container a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:26.945199 containerd[1680]: time="2025-12-16T12:15:26.945160058Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-8ccrj,Uid:2adc4df3-4f5c-4709-800e-2d961848e0d0,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:15:26.946049 containerd[1680]: time="2025-12-16T12:15:26.945695060Z" level=info msg="CreateContainer within sandbox \"c54ea329a163955226669c3e55645041df19fa176b70d3a8fccb6805d08c1b41\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58\"" Dec 16 12:15:26.946974 containerd[1680]: time="2025-12-16T12:15:26.946884626Z" level=info msg="StartContainer for \"a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58\"" Dec 16 12:15:26.948454 containerd[1680]: time="2025-12-16T12:15:26.948365234Z" level=info msg="connecting to shim a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58" address="unix:///run/containerd/s/283d1e26ad81add0e0f4b5fed594544405336118bfa79aed3bff74479abe9e49" protocol=ttrpc version=3 Dec 16 12:15:26.965360 systemd[1]: Started cri-containerd-a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58.scope - libcontainer container a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58. 
Dec 16 12:15:26.970765 containerd[1680]: time="2025-12-16T12:15:26.970718588Z" level=info msg="connecting to shim 9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb" address="unix:///run/containerd/s/507b8e916ef734dd03db135afa76061a073fa61b5ca391837baf8835da4b438b" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:15:26.996510 systemd[1]: Started cri-containerd-9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb.scope - libcontainer container 9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb. Dec 16 12:15:27.004000 audit: BPF prog-id=138 op=LOAD Dec 16 12:15:27.004000 audit[2978]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2941 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.004000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139346537316436336563343761613238303666316532663037333862 Dec 16 12:15:27.004000 audit: BPF prog-id=139 op=LOAD Dec 16 12:15:27.004000 audit[2978]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2941 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.004000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139346537316436336563343761613238303666316532663037333862 Dec 16 12:15:27.004000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:15:27.004000 
audit[2978]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2941 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.004000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139346537316436336563343761613238303666316532663037333862 Dec 16 12:15:27.004000 audit: BPF prog-id=138 op=UNLOAD Dec 16 12:15:27.004000 audit[2978]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2941 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.004000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139346537316436336563343761613238303666316532663037333862 Dec 16 12:15:27.004000 audit: BPF prog-id=140 op=LOAD Dec 16 12:15:27.004000 audit[2978]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2941 pid=2978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.004000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6139346537316436336563343761613238303666316532663037333862 Dec 16 12:15:27.010000 audit: BPF 
prog-id=141 op=LOAD Dec 16 12:15:27.010000 audit: BPF prog-id=142 op=LOAD Dec 16 12:15:27.010000 audit[3017]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.010000 audit: BPF prog-id=142 op=UNLOAD Dec 16 12:15:27.010000 audit[3017]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.011000 audit: BPF prog-id=143 op=LOAD Dec 16 12:15:27.011000 audit[3017]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.011000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.011000 audit: BPF prog-id=144 op=LOAD Dec 16 12:15:27.011000 audit[3017]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.011000 audit: BPF prog-id=144 op=UNLOAD Dec 16 12:15:27.011000 audit[3017]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.011000 audit: BPF prog-id=143 op=UNLOAD Dec 16 12:15:27.011000 audit[3017]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:15:27.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.011000 audit: BPF prog-id=145 op=LOAD Dec 16 12:15:27.011000 audit[3017]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3000 pid=3017 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962333565653061646532316263663936363139613062303534363463 Dec 16 12:15:27.026948 containerd[1680]: time="2025-12-16T12:15:27.026908835Z" level=info msg="StartContainer for \"a94e71d63ec47aa2806f1e2f0738be5b0a0221cef94229b378e67cef2218bb58\" returns successfully" Dec 16 12:15:27.038851 containerd[1680]: time="2025-12-16T12:15:27.038645654Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-8ccrj,Uid:2adc4df3-4f5c-4709-800e-2d961848e0d0,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb\"" Dec 16 12:15:27.041254 containerd[1680]: time="2025-12-16T12:15:27.041211987Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:15:27.176000 audit[3089]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.176000 audit[3089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffa2ed820 a2=0 a3=1 items=0 ppid=2997 pid=3089 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.176000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:15:27.176000 audit[3090]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.176000 audit[3090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc4a25520 a2=0 a3=1 items=0 ppid=2997 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.176000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:15:27.177000 audit[3092]: NETFILTER_CFG table=nat:56 family=2 entries=1 op=nft_register_chain pid=3092 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.177000 audit[3092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd0fc9500 a2=0 a3=1 items=0 ppid=2997 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.177000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:15:27.179000 audit[3095]: NETFILTER_CFG table=filter:57 family=2 entries=1 op=nft_register_chain pid=3095 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.179000 audit[3095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff49698e0 a2=0 a3=1 items=0 ppid=2997 
pid=3095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.179000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:15:27.183000 audit[3094]: NETFILTER_CFG table=nat:58 family=10 entries=1 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.183000 audit[3094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff2c10470 a2=0 a3=1 items=0 ppid=2997 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.183000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:15:27.185000 audit[3096]: NETFILTER_CFG table=filter:59 family=10 entries=1 op=nft_register_chain pid=3096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.185000 audit[3096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd7b05290 a2=0 a3=1 items=0 ppid=2997 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.185000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:15:27.282000 audit[3097]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3097 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.282000 audit[3097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffff911a460 a2=0 a3=1 items=0 
ppid=2997 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.282000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:15:27.284000 audit[3099]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3099 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.284000 audit[3099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff6700170 a2=0 a3=1 items=0 ppid=2997 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.284000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Dec 16 12:15:27.287000 audit[3102]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3102 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.287000 audit[3102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc9ae2190 a2=0 a3=1 items=0 ppid=2997 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.287000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Dec 16 12:15:27.289000 audit[3103]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3103 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.289000 audit[3103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcbb7c250 a2=0 a3=1 items=0 ppid=2997 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.289000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:15:27.291000 audit[3105]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3105 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.291000 audit[3105]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcea67870 a2=0 a3=1 items=0 ppid=2997 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.291000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:15:27.293000 audit[3106]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3106 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.293000 audit[3106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 
a1=ffffc3a6e7c0 a2=0 a3=1 items=0 ppid=2997 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.293000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:15:27.296000 audit[3108]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3108 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.296000 audit[3108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffdef5a570 a2=0 a3=1 items=0 ppid=2997 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.296000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Dec 16 12:15:27.300000 audit[3111]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.300000 audit[3111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffffd8f0980 a2=0 a3=1 items=0 ppid=2997 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.300000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Dec 16 12:15:27.301000 audit[3112]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3112 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.301000 audit[3112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe2f95970 a2=0 a3=1 items=0 ppid=2997 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.301000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:15:27.304000 audit[3114]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3114 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.304000 audit[3114]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd8473920 a2=0 a3=1 items=0 ppid=2997 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.304000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:15:27.305000 audit[3115]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.305000 audit[3115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff2c87080 a2=0 a3=1 
items=0 ppid=2997 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.305000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:15:27.308000 audit[3117]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3117 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.308000 audit[3117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe29deb80 a2=0 a3=1 items=0 ppid=2997 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.308000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:15:27.311000 audit[3120]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.311000 audit[3120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffc32ef5e0 a2=0 a3=1 items=0 ppid=2997 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.311000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:15:27.315000 audit[3123]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.315000 audit[3123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd8022be0 a2=0 a3=1 items=0 ppid=2997 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.315000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Dec 16 12:15:27.317000 audit[3124]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3124 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.317000 audit[3124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd3d10f80 a2=0 a3=1 items=0 ppid=2997 pid=3124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.317000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:15:27.319000 audit[3126]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.319000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 
a1=ffffe9191420 a2=0 a3=1 items=0 ppid=2997 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.319000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:15:27.323000 audit[3129]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3129 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.323000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe940e100 a2=0 a3=1 items=0 ppid=2997 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.323000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:15:27.324000 audit[3130]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3130 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.324000 audit[3130]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffed4d51a0 a2=0 a3=1 items=0 ppid=2997 pid=3130 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.324000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:15:27.327000 audit[3132]: NETFILTER_CFG 
table=nat:78 family=2 entries=1 op=nft_register_rule pid=3132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:15:27.327000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffc3ddc8e0 a2=0 a3=1 items=0 ppid=2997 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.327000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:15:27.349000 audit[3138]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:15:27.349000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffffc1fd320 a2=0 a3=1 items=0 ppid=2997 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.349000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:15:27.365000 audit[3138]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:15:27.365000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=fffffc1fd320 a2=0 a3=1 items=0 ppid=2997 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.365000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:15:27.366000 audit[3143]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3143 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.366000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffd5289a10 a2=0 a3=1 items=0 ppid=2997 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.366000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:15:27.369000 audit[3145]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3145 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.369000 audit[3145]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffc04ca880 a2=0 a3=1 items=0 ppid=2997 pid=3145 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.369000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Dec 16 12:15:27.372000 audit[3148]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3148 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.372000 audit[3148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=fffff87935f0 a2=0 a3=1 items=0 ppid=2997 pid=3148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.372000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Dec 16 12:15:27.373000 audit[3149]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3149 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.373000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe90383a0 a2=0 a3=1 items=0 ppid=2997 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.373000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:15:27.375000 audit[3151]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3151 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.375000 audit[3151]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdfe95f50 a2=0 a3=1 items=0 ppid=2997 pid=3151 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.375000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:15:27.376000 audit[3152]: NETFILTER_CFG table=filter:86 family=10 entries=1 
op=nft_register_chain pid=3152 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.376000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc4e86df0 a2=0 a3=1 items=0 ppid=2997 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.376000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:15:27.379000 audit[3154]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3154 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.379000 audit[3154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffc23c8950 a2=0 a3=1 items=0 ppid=2997 pid=3154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.379000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Dec 16 12:15:27.382000 audit[3157]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3157 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.382000 audit[3157]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffe96e50b0 a2=0 a3=1 items=0 ppid=2997 pid=3157 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.382000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Dec 16 12:15:27.384000 audit[3158]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3158 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.384000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffcf4d0b60 a2=0 a3=1 items=0 ppid=2997 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.384000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:15:27.386000 audit[3160]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3160 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.386000 audit[3160]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdfe8db40 a2=0 a3=1 items=0 ppid=2997 pid=3160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.386000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:15:27.388000 audit[3161]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3161 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.388000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff0760470 a2=0 
a3=1 items=0 ppid=2997 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.388000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:15:27.390000 audit[3163]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.390000 audit[3163]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff9ec99a0 a2=0 a3=1 items=0 ppid=2997 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.390000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:15:27.394000 audit[3166]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3166 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.394000 audit[3166]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffffecb920 a2=0 a3=1 items=0 ppid=2997 pid=3166 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.394000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Dec 16 12:15:27.397000 audit[3169]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3169 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.397000 audit[3169]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff820b940 a2=0 a3=1 items=0 ppid=2997 pid=3169 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.397000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Dec 16 12:15:27.399000 audit[3170]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3170 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.399000 audit[3170]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffd717bc90 a2=0 a3=1 items=0 ppid=2997 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.399000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:15:27.401000 audit[3172]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3172 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.401000 audit[3172]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 
a0=3 a1=ffffcebbb040 a2=0 a3=1 items=0 ppid=2997 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.401000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:15:27.404000 audit[3175]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3175 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.404000 audit[3175]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc2beb8e0 a2=0 a3=1 items=0 ppid=2997 pid=3175 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.404000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:15:27.405000 audit[3176]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3176 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.405000 audit[3176]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe842b960 a2=0 a3=1 items=0 ppid=2997 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.405000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:15:27.408000 audit[3178]: 
NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3178 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.408000 audit[3178]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffc3fdd3e0 a2=0 a3=1 items=0 ppid=2997 pid=3178 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.408000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:15:27.409000 audit[3179]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3179 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.409000 audit[3179]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff710a230 a2=0 a3=1 items=0 ppid=2997 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.409000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:15:27.411000 audit[3181]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3181 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.411000 audit[3181]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffc481ef30 a2=0 a3=1 items=0 ppid=2997 pid=3181 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.411000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:15:27.414000 audit[3184]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3184 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:15:27.414000 audit[3184]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffd62d9a10 a2=0 a3=1 items=0 ppid=2997 pid=3184 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.414000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:15:27.417000 audit[3186]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:15:27.417000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffe49b1c10 a2=0 a3=1 items=0 ppid=2997 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.417000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:15:27.418000 audit[3186]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3186 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:15:27.418000 audit[3186]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffe49b1c10 a2=0 a3=1 items=0 ppid=2997 pid=3186 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:27.418000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:15:27.782484 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2922684639.mount: Deactivated successfully. Dec 16 12:15:28.071667 kubelet[2884]: I1216 12:15:28.071536 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-fqbtk" podStartSLOduration=2.071517402 podStartE2EDuration="2.071517402s" podCreationTimestamp="2025-12-16 12:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:15:27.688381288 +0000 UTC m=+7.307374748" watchObservedRunningTime="2025-12-16 12:15:28.071517402 +0000 UTC m=+7.690510862" Dec 16 12:15:28.880131 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1238035037.mount: Deactivated successfully. 
Dec 16 12:15:29.432412 containerd[1680]: time="2025-12-16T12:15:29.432176661Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:29.436009 containerd[1680]: time="2025-12-16T12:15:29.435915160Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Dec 16 12:15:29.437926 containerd[1680]: time="2025-12-16T12:15:29.437879170Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:29.440508 containerd[1680]: time="2025-12-16T12:15:29.440461903Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:29.441281 containerd[1680]: time="2025-12-16T12:15:29.441240907Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.399995359s" Dec 16 12:15:29.441341 containerd[1680]: time="2025-12-16T12:15:29.441282147Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 16 12:15:29.443622 containerd[1680]: time="2025-12-16T12:15:29.443493479Z" level=info msg="CreateContainer within sandbox \"9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:15:29.455905 containerd[1680]: time="2025-12-16T12:15:29.455855462Z" level=info msg="Container 
7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:29.464250 containerd[1680]: time="2025-12-16T12:15:29.464185704Z" level=info msg="CreateContainer within sandbox \"9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\"" Dec 16 12:15:29.464955 containerd[1680]: time="2025-12-16T12:15:29.464892948Z" level=info msg="StartContainer for \"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\"" Dec 16 12:15:29.466067 containerd[1680]: time="2025-12-16T12:15:29.466018514Z" level=info msg="connecting to shim 7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5" address="unix:///run/containerd/s/507b8e916ef734dd03db135afa76061a073fa61b5ca391837baf8835da4b438b" protocol=ttrpc version=3 Dec 16 12:15:29.490609 systemd[1]: Started cri-containerd-7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5.scope - libcontainer container 7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5. 
Dec 16 12:15:29.500000 audit: BPF prog-id=146 op=LOAD Dec 16 12:15:29.500000 audit: BPF prog-id=147 op=LOAD Dec 16 12:15:29.500000 audit[3195]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:29.500000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064 Dec 16 12:15:29.501000 audit: BPF prog-id=147 op=UNLOAD Dec 16 12:15:29.501000 audit[3195]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:29.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064 Dec 16 12:15:29.501000 audit: BPF prog-id=148 op=LOAD Dec 16 12:15:29.501000 audit[3195]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:29.501000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064
Dec 16 12:15:29.501000 audit: BPF prog-id=149 op=LOAD
Dec 16 12:15:29.501000 audit[3195]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:29.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064
Dec 16 12:15:29.501000 audit: BPF prog-id=149 op=UNLOAD
Dec 16 12:15:29.501000 audit[3195]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:29.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064
Dec 16 12:15:29.501000 audit: BPF prog-id=148 op=UNLOAD
Dec 16 12:15:29.501000 audit[3195]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:29.501000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064
Dec 16 12:15:29.502000 audit: BPF prog-id=150 op=LOAD
Dec 16 12:15:29.502000 audit[3195]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3000 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:29.502000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3730383236323763613762396235373038396361663730383666323064
Dec 16 12:15:29.518592 containerd[1680]: time="2025-12-16T12:15:29.518554742Z" level=info msg="StartContainer for \"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\" returns successfully"
Dec 16 12:15:33.511853 kubelet[2884]: I1216 12:15:33.511768 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-8ccrj" podStartSLOduration=5.109491375 podStartE2EDuration="7.511753586s" podCreationTimestamp="2025-12-16 12:15:26 +0000 UTC" firstStartedPulling="2025-12-16 12:15:27.03977058 +0000 UTC m=+6.658764040" lastFinishedPulling="2025-12-16 12:15:29.442032791 +0000 UTC m=+9.061026251" observedRunningTime="2025-12-16 12:15:29.696583489 +0000 UTC m=+9.315576949" watchObservedRunningTime="2025-12-16 12:15:33.511753586 +0000 UTC m=+13.130747046"
Dec 16 12:15:34.555359 sudo[1926]: pam_unix(sudo:session): session closed for user root
Dec 16 12:15:34.554000 audit[1926]: USER_END pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:34.558353 kernel: kauditd_printk_skb: 224 callbacks suppressed
Dec 16 12:15:34.558421 kernel: audit: type=1106 audit(1765887334.554:520): pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:34.554000 audit[1926]: CRED_DISP pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:34.562315 kernel: audit: type=1104 audit(1765887334.554:521): pid=1926 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:34.721322 sshd[1925]: Connection closed by 139.178.68.195 port 57592
Dec 16 12:15:34.725031 sshd-session[1921]: pam_unix(sshd:session): session closed for user core
Dec 16 12:15:34.725000 audit[1921]: USER_END pid=1921 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:34.725000 audit[1921]: CRED_DISP pid=1921 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:34.732457 systemd[1]: sshd@6-10.0.21.180:22-139.178.68.195:57592.service: Deactivated successfully.
Dec 16 12:15:34.732768 kernel: audit: type=1106 audit(1765887334.725:522): pid=1921 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:34.732795 kernel: audit: type=1104 audit(1765887334.725:523): pid=1921 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:34.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.21.180:22-139.178.68.195:57592 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:34.735715 systemd[1]: session-8.scope: Deactivated successfully.
Dec 16 12:15:34.735966 systemd[1]: session-8.scope: Consumed 7.416s CPU time, 223.7M memory peak.
Dec 16 12:15:34.737122 kernel: audit: type=1131 audit(1765887334.733:524): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-10.0.21.180:22-139.178.68.195:57592 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:34.738286 systemd-logind[1646]: Session 8 logged out. Waiting for processes to exit.
Dec 16 12:15:34.739932 systemd-logind[1646]: Removed session 8.
Dec 16 12:15:36.346000 audit[3286]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:36.353787 kernel: audit: type=1325 audit(1765887336.346:525): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:36.353886 kernel: audit: type=1300 audit(1765887336.346:525): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcb1fd4d0 a2=0 a3=1 items=0 ppid=2997 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:36.346000 audit[3286]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffcb1fd4d0 a2=0 a3=1 items=0 ppid=2997 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:36.346000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:36.356692 kernel: audit: type=1327 audit(1765887336.346:525): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:36.350000 audit[3286]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:36.358485 kernel: audit: type=1325 audit(1765887336.350:526): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3286 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:36.350000 audit[3286]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcb1fd4d0 a2=0 a3=1 items=0 ppid=2997 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:36.363772 kernel: audit: type=1300 audit(1765887336.350:526): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcb1fd4d0 a2=0 a3=1 items=0 ppid=2997 pid=3286 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:36.350000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:36.383000 audit[3288]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:36.383000 audit[3288]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffe8e3edf0 a2=0 a3=1 items=0 ppid=2997 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:36.383000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:36.389000 audit[3288]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3288 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:36.389000 audit[3288]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe8e3edf0 a2=0 a3=1 items=0 ppid=2997 pid=3288 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:36.389000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:39.250000 audit[3290]: NETFILTER_CFG table=filter:109 family=2 entries=18 op=nft_register_rule pid=3290 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:39.250000 audit[3290]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff0ed5320 a2=0 a3=1 items=0 ppid=2997 pid=3290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:39.250000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:39.256000 audit[3290]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3290 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:39.256000 audit[3290]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff0ed5320 a2=0 a3=1 items=0 ppid=2997 pid=3290 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:39.256000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:39.285000 audit[3292]: NETFILTER_CFG table=filter:111 family=2 entries=20 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:39.285000 audit[3292]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd8826200 a2=0 a3=1 items=0 ppid=2997 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:39.285000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:39.290000 audit[3292]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3292 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:39.290000 audit[3292]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd8826200 a2=0 a3=1 items=0 ppid=2997 pid=3292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:39.290000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.482000 audit[3295]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3295 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.484327 kernel: kauditd_printk_skb: 19 callbacks suppressed
Dec 16 12:15:41.484379 kernel: audit: type=1325 audit(1765887341.482:533): table=filter:113 family=2 entries=21 op=nft_register_rule pid=3295 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.482000 audit[3295]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe6248aa0 a2=0 a3=1 items=0 ppid=2997 pid=3295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.490177 kernel: audit: type=1300 audit(1765887341.482:533): arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe6248aa0 a2=0 a3=1 items=0 ppid=2997 pid=3295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.482000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.493846 kernel: audit: type=1327 audit(1765887341.482:533): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.491000 audit[3295]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3295 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.496771 kernel: audit: type=1325 audit(1765887341.491:534): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3295 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.491000 audit[3295]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe6248aa0 a2=0 a3=1 items=0 ppid=2997 pid=3295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.500777 kernel: audit: type=1300 audit(1765887341.491:534): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe6248aa0 a2=0 a3=1 items=0 ppid=2997 pid=3295 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.500994 kernel: audit: type=1327 audit(1765887341.491:534): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.491000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.517633 systemd[1]: Created slice kubepods-besteffort-pod844a40f5_7cb7_4039_b9b2_e01b22e43570.slice - libcontainer container kubepods-besteffort-pod844a40f5_7cb7_4039_b9b2_e01b22e43570.slice.
Dec 16 12:15:41.516000 audit[3297]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3297 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.516000 audit[3297]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffd63bd020 a2=0 a3=1 items=0 ppid=2997 pid=3297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.522711 kernel: audit: type=1325 audit(1765887341.516:535): table=filter:115 family=2 entries=22 op=nft_register_rule pid=3297 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.522757 kernel: audit: type=1300 audit(1765887341.516:535): arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffd63bd020 a2=0 a3=1 items=0 ppid=2997 pid=3297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.516000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.524000 audit[3297]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3297 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.528760 kernel: audit: type=1327 audit(1765887341.516:535): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.528845 kernel: audit: type=1325 audit(1765887341.524:536): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3297 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:41.524000 audit[3297]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd63bd020 a2=0 a3=1 items=0 ppid=2997 pid=3297 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.524000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:41.553847 kubelet[2884]: I1216 12:15:41.553792 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/844a40f5-7cb7-4039-b9b2-e01b22e43570-tigera-ca-bundle\") pod \"calico-typha-69b4fdfc45-t24v9\" (UID: \"844a40f5-7cb7-4039-b9b2-e01b22e43570\") " pod="calico-system/calico-typha-69b4fdfc45-t24v9"
Dec 16 12:15:41.553847 kubelet[2884]: I1216 12:15:41.553835 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tlzkv\" (UniqueName: \"kubernetes.io/projected/844a40f5-7cb7-4039-b9b2-e01b22e43570-kube-api-access-tlzkv\") pod \"calico-typha-69b4fdfc45-t24v9\" (UID: \"844a40f5-7cb7-4039-b9b2-e01b22e43570\") " pod="calico-system/calico-typha-69b4fdfc45-t24v9"
Dec 16 12:15:41.553847 kubelet[2884]: I1216 12:15:41.553854 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/844a40f5-7cb7-4039-b9b2-e01b22e43570-typha-certs\") pod \"calico-typha-69b4fdfc45-t24v9\" (UID: \"844a40f5-7cb7-4039-b9b2-e01b22e43570\") " pod="calico-system/calico-typha-69b4fdfc45-t24v9"
Dec 16 12:15:41.702815 systemd[1]: Created slice kubepods-besteffort-poda546a773_f93e_4e94_979b_b999641f0781.slice - libcontainer container kubepods-besteffort-poda546a773_f93e_4e94_979b_b999641f0781.slice.
Dec 16 12:15:41.756231 kubelet[2884]: I1216 12:15:41.755961 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qt6gv\" (UniqueName: \"kubernetes.io/projected/a546a773-f93e-4e94-979b-b999641f0781-kube-api-access-qt6gv\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756231 kubelet[2884]: I1216 12:15:41.756025 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-lib-modules\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756231 kubelet[2884]: I1216 12:15:41.756047 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/a546a773-f93e-4e94-979b-b999641f0781-node-certs\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756231 kubelet[2884]: I1216 12:15:41.756063 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-var-run-calico\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756231 kubelet[2884]: I1216 12:15:41.756130 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-var-lib-calico\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756509 kubelet[2884]: I1216 12:15:41.756150 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-xtables-lock\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756509 kubelet[2884]: I1216 12:15:41.756193 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-cni-bin-dir\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.756509 kubelet[2884]: I1216 12:15:41.756207 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-cni-net-dir\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.757189 kubelet[2884]: I1216 12:15:41.757021 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-policysync\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.757189 kubelet[2884]: I1216 12:15:41.757189 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-cni-log-dir\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.757291 kubelet[2884]: I1216 12:15:41.757210 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/a546a773-f93e-4e94-979b-b999641f0781-flexvol-driver-host\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.757291 kubelet[2884]: I1216 12:15:41.757230 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a546a773-f93e-4e94-979b-b999641f0781-tigera-ca-bundle\") pod \"calico-node-t8f6s\" (UID: \"a546a773-f93e-4e94-979b-b999641f0781\") " pod="calico-system/calico-node-t8f6s"
Dec 16 12:15:41.822790 containerd[1680]: time="2025-12-16T12:15:41.822714290Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69b4fdfc45-t24v9,Uid:844a40f5-7cb7-4039-b9b2-e01b22e43570,Namespace:calico-system,Attempt:0,}"
Dec 16 12:15:41.847005 containerd[1680]: time="2025-12-16T12:15:41.846548612Z" level=info msg="connecting to shim 11c89bee84b274c7f9594c06a3bcb6534e8eaf6e6a1cacf97204e8ba13f83cf0" address="unix:///run/containerd/s/e44db5688cbfa9a7ba8b76b0c9227f9f3e87e3c0102c579a4ca4f7e7afb2feff" namespace=k8s.io protocol=ttrpc version=3
Dec 16 12:15:41.860425 kubelet[2884]: E1216 12:15:41.860348 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.860615 kubelet[2884]: W1216 12:15:41.860595 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.861319 kubelet[2884]: E1216 12:15:41.861291 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.871221 kubelet[2884]: E1216 12:15:41.871140 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.871221 kubelet[2884]: W1216 12:15:41.871163 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.871221 kubelet[2884]: E1216 12:15:41.871182 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.881397 systemd[1]: Started cri-containerd-11c89bee84b274c7f9594c06a3bcb6534e8eaf6e6a1cacf97204e8ba13f83cf0.scope - libcontainer container 11c89bee84b274c7f9594c06a3bcb6534e8eaf6e6a1cacf97204e8ba13f83cf0.
Dec 16 12:15:41.887104 kubelet[2884]: E1216 12:15:41.886648 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.887104 kubelet[2884]: W1216 12:15:41.886688 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.887104 kubelet[2884]: E1216 12:15:41.886711 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.896884 kubelet[2884]: E1216 12:15:41.896790 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf"
Dec 16 12:15:41.906000 audit: BPF prog-id=151 op=LOAD
Dec 16 12:15:41.907000 audit: BPF prog-id=152 op=LOAD
Dec 16 12:15:41.907000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.907000 audit: BPF prog-id=152 op=UNLOAD
Dec 16 12:15:41.907000 audit[3318]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.907000 audit: BPF prog-id=153 op=LOAD
Dec 16 12:15:41.907000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.907000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.908000 audit: BPF prog-id=154 op=LOAD
Dec 16 12:15:41.908000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.908000 audit: BPF prog-id=154 op=UNLOAD
Dec 16 12:15:41.908000 audit[3318]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.908000 audit: BPF prog-id=153 op=UNLOAD
Dec 16 12:15:41.908000 audit[3318]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.908000 audit: BPF prog-id=155 op=LOAD
Dec 16 12:15:41.908000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=3308 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:41.908000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3131633839626565383462323734633766393539346330366133626362
Dec 16 12:15:41.935028 containerd[1680]: time="2025-12-16T12:15:41.934991263Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-69b4fdfc45-t24v9,Uid:844a40f5-7cb7-4039-b9b2-e01b22e43570,Namespace:calico-system,Attempt:0,} returns sandbox id \"11c89bee84b274c7f9594c06a3bcb6534e8eaf6e6a1cacf97204e8ba13f83cf0\""
Dec 16 12:15:41.936638 containerd[1680]: time="2025-12-16T12:15:41.936610991Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\""
Dec 16 12:15:41.950761 kubelet[2884]: E1216 12:15:41.950713 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.950761 kubelet[2884]: W1216 12:15:41.950740 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.950951 kubelet[2884]: E1216 12:15:41.950761 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.950951 kubelet[2884]: E1216 12:15:41.950944 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.951044 kubelet[2884]: W1216 12:15:41.950990 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.951092 kubelet[2884]: E1216 12:15:41.951044 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.951280 kubelet[2884]: E1216 12:15:41.951262 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.951280 kubelet[2884]: W1216 12:15:41.951275 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.951344 kubelet[2884]: E1216 12:15:41.951286 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.951494 kubelet[2884]: E1216 12:15:41.951480 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.951494 kubelet[2884]: W1216 12:15:41.951492 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.951543 kubelet[2884]: E1216 12:15:41.951501 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.951689 kubelet[2884]: E1216 12:15:41.951674 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.951689 kubelet[2884]: W1216 12:15:41.951688 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.951744 kubelet[2884]: E1216 12:15:41.951697 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.951873 kubelet[2884]: E1216 12:15:41.951860 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.951873 kubelet[2884]: W1216 12:15:41.951870 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.951919 kubelet[2884]: E1216 12:15:41.951886 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input"
Dec 16 12:15:41.952052 kubelet[2884]: E1216 12:15:41.952038 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input
Dec 16 12:15:41.952052 kubelet[2884]: W1216 12:15:41.952050 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: ""
Dec 16 12:15:41.952121 kubelet[2884]: E1216 12:15:41.952068 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.952250 kubelet[2884]: E1216 12:15:41.952236 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.952250 kubelet[2884]: W1216 12:15:41.952248 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.952295 kubelet[2884]: E1216 12:15:41.952256 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.952434 kubelet[2884]: E1216 12:15:41.952421 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.952467 kubelet[2884]: W1216 12:15:41.952433 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.952467 kubelet[2884]: E1216 12:15:41.952442 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.952668 kubelet[2884]: E1216 12:15:41.952635 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.952668 kubelet[2884]: W1216 12:15:41.952646 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.952668 kubelet[2884]: E1216 12:15:41.952654 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.952811 kubelet[2884]: E1216 12:15:41.952800 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.952811 kubelet[2884]: W1216 12:15:41.952810 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.952858 kubelet[2884]: E1216 12:15:41.952819 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.952987 kubelet[2884]: E1216 12:15:41.952973 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.952987 kubelet[2884]: W1216 12:15:41.952985 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.953047 kubelet[2884]: E1216 12:15:41.952994 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.953260 kubelet[2884]: E1216 12:15:41.953244 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.953260 kubelet[2884]: W1216 12:15:41.953258 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.953324 kubelet[2884]: E1216 12:15:41.953268 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.953407 kubelet[2884]: E1216 12:15:41.953394 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.953431 kubelet[2884]: W1216 12:15:41.953414 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.953431 kubelet[2884]: E1216 12:15:41.953423 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.953573 kubelet[2884]: E1216 12:15:41.953562 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.953600 kubelet[2884]: W1216 12:15:41.953573 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.953600 kubelet[2884]: E1216 12:15:41.953580 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.953734 kubelet[2884]: E1216 12:15:41.953724 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.953758 kubelet[2884]: W1216 12:15:41.953734 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.953758 kubelet[2884]: E1216 12:15:41.953743 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.953906 kubelet[2884]: E1216 12:15:41.953895 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.953929 kubelet[2884]: W1216 12:15:41.953906 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.953929 kubelet[2884]: E1216 12:15:41.953914 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.954056 kubelet[2884]: E1216 12:15:41.954046 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.954086 kubelet[2884]: W1216 12:15:41.954056 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.954086 kubelet[2884]: E1216 12:15:41.954064 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.954221 kubelet[2884]: E1216 12:15:41.954206 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.954221 kubelet[2884]: W1216 12:15:41.954220 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.954284 kubelet[2884]: E1216 12:15:41.954228 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.954388 kubelet[2884]: E1216 12:15:41.954375 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.954388 kubelet[2884]: W1216 12:15:41.954386 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.954426 kubelet[2884]: E1216 12:15:41.954393 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.959121 kubelet[2884]: E1216 12:15:41.958996 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.959121 kubelet[2884]: W1216 12:15:41.959019 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.959121 kubelet[2884]: E1216 12:15:41.959037 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.959353 kubelet[2884]: I1216 12:15:41.959216 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/504cf836-455c-42a5-8d68-245e5d4890cf-socket-dir\") pod \"csi-node-driver-2hkqn\" (UID: \"504cf836-455c-42a5-8d68-245e5d4890cf\") " pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:41.959475 kubelet[2884]: E1216 12:15:41.959431 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.959475 kubelet[2884]: W1216 12:15:41.959451 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.959475 kubelet[2884]: E1216 12:15:41.959469 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.959720 kubelet[2884]: E1216 12:15:41.959589 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.959720 kubelet[2884]: W1216 12:15:41.959596 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.959720 kubelet[2884]: E1216 12:15:41.959603 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.959854 kubelet[2884]: E1216 12:15:41.959840 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.960163 kubelet[2884]: W1216 12:15:41.960132 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.960527 kubelet[2884]: E1216 12:15:41.960272 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.960527 kubelet[2884]: I1216 12:15:41.960305 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qbzzt\" (UniqueName: \"kubernetes.io/projected/504cf836-455c-42a5-8d68-245e5d4890cf-kube-api-access-qbzzt\") pod \"csi-node-driver-2hkqn\" (UID: \"504cf836-455c-42a5-8d68-245e5d4890cf\") " pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:41.960715 kubelet[2884]: E1216 12:15:41.960685 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.960846 kubelet[2884]: W1216 12:15:41.960803 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.960931 kubelet[2884]: E1216 12:15:41.960918 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.961002 kubelet[2884]: I1216 12:15:41.960990 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/504cf836-455c-42a5-8d68-245e5d4890cf-kubelet-dir\") pod \"csi-node-driver-2hkqn\" (UID: \"504cf836-455c-42a5-8d68-245e5d4890cf\") " pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:41.961601 kubelet[2884]: E1216 12:15:41.961222 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.961601 kubelet[2884]: W1216 12:15:41.961447 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.961601 kubelet[2884]: E1216 12:15:41.961460 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.961794 kubelet[2884]: E1216 12:15:41.961781 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.961849 kubelet[2884]: W1216 12:15:41.961839 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.961906 kubelet[2884]: E1216 12:15:41.961896 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.962634 kubelet[2884]: E1216 12:15:41.962200 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.962767 kubelet[2884]: W1216 12:15:41.962744 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.962825 kubelet[2884]: E1216 12:15:41.962815 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.963067 kubelet[2884]: E1216 12:15:41.963040 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.963067 kubelet[2884]: W1216 12:15:41.963053 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.963222 kubelet[2884]: E1216 12:15:41.963165 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.963222 kubelet[2884]: I1216 12:15:41.963192 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/504cf836-455c-42a5-8d68-245e5d4890cf-registration-dir\") pod \"csi-node-driver-2hkqn\" (UID: \"504cf836-455c-42a5-8d68-245e5d4890cf\") " pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:41.963919 kubelet[2884]: E1216 12:15:41.963792 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.963919 kubelet[2884]: W1216 12:15:41.963808 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.963919 kubelet[2884]: E1216 12:15:41.963820 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.964373 kubelet[2884]: E1216 12:15:41.964270 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.964373 kubelet[2884]: W1216 12:15:41.964285 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.964373 kubelet[2884]: E1216 12:15:41.964300 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.964533 kubelet[2884]: E1216 12:15:41.964521 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.964580 kubelet[2884]: W1216 12:15:41.964570 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.964637 kubelet[2884]: E1216 12:15:41.964627 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.964824 kubelet[2884]: E1216 12:15:41.964812 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.964884 kubelet[2884]: W1216 12:15:41.964872 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.964969 kubelet[2884]: E1216 12:15:41.964956 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:41.965045 kubelet[2884]: I1216 12:15:41.965033 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/504cf836-455c-42a5-8d68-245e5d4890cf-varrun\") pod \"csi-node-driver-2hkqn\" (UID: \"504cf836-455c-42a5-8d68-245e5d4890cf\") " pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:41.965435 kubelet[2884]: E1216 12:15:41.965230 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.965435 kubelet[2884]: W1216 12:15:41.965248 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.965435 kubelet[2884]: E1216 12:15:41.965260 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:41.965435 kubelet[2884]: E1216 12:15:41.965392 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:41.965435 kubelet[2884]: W1216 12:15:41.965400 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:41.965435 kubelet[2884]: E1216 12:15:41.965408 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.006042 containerd[1680]: time="2025-12-16T12:15:42.005999305Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-t8f6s,Uid:a546a773-f93e-4e94-979b-b999641f0781,Namespace:calico-system,Attempt:0,}" Dec 16 12:15:42.032169 containerd[1680]: time="2025-12-16T12:15:42.030222668Z" level=info msg="connecting to shim b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543" address="unix:///run/containerd/s/46b4aab1dcd2e3d75ce022e3cf996172cada088dd310f91f38aa62000b2e8b87" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:15:42.058345 systemd[1]: Started cri-containerd-b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543.scope - libcontainer container b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543. Dec 16 12:15:42.066564 kubelet[2884]: E1216 12:15:42.066528 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.066821 kubelet[2884]: W1216 12:15:42.066652 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.066821 kubelet[2884]: E1216 12:15:42.066678 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.066997 kubelet[2884]: E1216 12:15:42.066976 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.066997 kubelet[2884]: W1216 12:15:42.066996 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.067115 kubelet[2884]: E1216 12:15:42.067016 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.067194 kubelet[2884]: E1216 12:15:42.067181 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.067194 kubelet[2884]: W1216 12:15:42.067191 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.067247 kubelet[2884]: E1216 12:15:42.067205 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.067372 kubelet[2884]: E1216 12:15:42.067361 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.067372 kubelet[2884]: W1216 12:15:42.067371 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.067508 kubelet[2884]: E1216 12:15:42.067383 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.068392 kubelet[2884]: E1216 12:15:42.068375 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.068392 kubelet[2884]: W1216 12:15:42.068393 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.068469 kubelet[2884]: E1216 12:15:42.068413 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.068899 kubelet[2884]: E1216 12:15:42.068855 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.068899 kubelet[2884]: W1216 12:15:42.068873 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.068899 kubelet[2884]: E1216 12:15:42.068886 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.069902 kubelet[2884]: E1216 12:15:42.069882 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.069902 kubelet[2884]: W1216 12:15:42.069901 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.070132 kubelet[2884]: E1216 12:15:42.070105 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.070398 kubelet[2884]: E1216 12:15:42.070367 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.070398 kubelet[2884]: W1216 12:15:42.070384 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.070473 kubelet[2884]: E1216 12:15:42.070426 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.070866 kubelet[2884]: E1216 12:15:42.070847 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.070866 kubelet[2884]: W1216 12:15:42.070863 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.070998 kubelet[2884]: E1216 12:15:42.070876 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.071067 kubelet[2884]: E1216 12:15:42.071045 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.071067 kubelet[2884]: W1216 12:15:42.071064 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.071138 kubelet[2884]: E1216 12:15:42.071092 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.071478 kubelet[2884]: E1216 12:15:42.071282 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.071478 kubelet[2884]: W1216 12:15:42.071294 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.071478 kubelet[2884]: E1216 12:15:42.071335 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.071780 kubelet[2884]: E1216 12:15:42.071688 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.071780 kubelet[2884]: W1216 12:15:42.071702 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.071780 kubelet[2884]: E1216 12:15:42.071743 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.072772 kubelet[2884]: E1216 12:15:42.072739 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.072772 kubelet[2884]: W1216 12:15:42.072756 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.073005 kubelet[2884]: E1216 12:15:42.072947 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073108 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.073872 kubelet[2884]: W1216 12:15:42.073121 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073148 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073382 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.073872 kubelet[2884]: W1216 12:15:42.073391 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073418 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073547 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.073872 kubelet[2884]: W1216 12:15:42.073554 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073574 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.073872 kubelet[2884]: E1216 12:15:42.073708 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.074161 kubelet[2884]: W1216 12:15:42.073716 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.074161 kubelet[2884]: E1216 12:15:42.073731 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.074161 kubelet[2884]: E1216 12:15:42.073871 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.074161 kubelet[2884]: W1216 12:15:42.073879 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.074161 kubelet[2884]: E1216 12:15:42.073892 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.074161 kubelet[2884]: E1216 12:15:42.074137 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.074161 kubelet[2884]: W1216 12:15:42.074146 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.074161 kubelet[2884]: E1216 12:15:42.074164 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.073000 audit: BPF prog-id=156 op=LOAD Dec 16 12:15:42.074611 kubelet[2884]: E1216 12:15:42.074320 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.074611 kubelet[2884]: W1216 12:15:42.074331 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.074611 kubelet[2884]: E1216 12:15:42.074371 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.074686 kubelet[2884]: E1216 12:15:42.074646 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.074686 kubelet[2884]: W1216 12:15:42.074661 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.074686 kubelet[2884]: E1216 12:15:42.074677 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.074927 kubelet[2884]: E1216 12:15:42.074903 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.074927 kubelet[2884]: W1216 12:15:42.074926 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.074980 kubelet[2884]: E1216 12:15:42.074945 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.074000 audit: BPF prog-id=157 op=LOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.074000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.075405 kubelet[2884]: E1216 12:15:42.075296 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.075405 kubelet[2884]: W1216 12:15:42.075308 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.075405 kubelet[2884]: E1216 12:15:42.075326 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.074000 audit: BPF prog-id=158 op=LOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.074000 audit: BPF prog-id=159 op=LOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.074000 audit: BPF prog-id=159 op=UNLOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.074000 audit: BPF prog-id=158 op=UNLOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.074000 audit: BPF prog-id=160 op=LOAD Dec 16 12:15:42.074000 audit[3415]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3403 pid=3415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.074000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6235613434666632366630316161363861616430396163333938616365 Dec 16 12:15:42.076112 kubelet[2884]: E1216 12:15:42.076094 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.076145 kubelet[2884]: W1216 12:15:42.076121 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.076176 kubelet[2884]: E1216 12:15:42.076142 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.076387 kubelet[2884]: E1216 12:15:42.076369 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.076387 kubelet[2884]: W1216 12:15:42.076384 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.076437 kubelet[2884]: E1216 12:15:42.076395 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:42.081225 kubelet[2884]: E1216 12:15:42.081167 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:42.081225 kubelet[2884]: W1216 12:15:42.081186 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:42.081330 kubelet[2884]: E1216 12:15:42.081241 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:42.090267 containerd[1680]: time="2025-12-16T12:15:42.090228974Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-t8f6s,Uid:a546a773-f93e-4e94-979b-b999641f0781,Namespace:calico-system,Attempt:0,} returns sandbox id \"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\"" Dec 16 12:15:42.538000 audit[3469]: NETFILTER_CFG table=filter:117 family=2 entries=22 op=nft_register_rule pid=3469 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:15:42.538000 audit[3469]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffca2000c0 a2=0 a3=1 items=0 ppid=2997 pid=3469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.538000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:15:42.547000 audit[3469]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3469 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:15:42.547000 audit[3469]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 
a1=ffffca2000c0 a2=0 a3=1 items=0 ppid=2997 pid=3469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:42.547000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:15:43.445857 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount30389369.mount: Deactivated successfully. Dec 16 12:15:43.648281 kubelet[2884]: E1216 12:15:43.648201 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:44.310854 containerd[1680]: time="2025-12-16T12:15:44.310765139Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:44.312153 containerd[1680]: time="2025-12-16T12:15:44.312095826Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Dec 16 12:15:44.314099 containerd[1680]: time="2025-12-16T12:15:44.313809634Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:44.316927 containerd[1680]: time="2025-12-16T12:15:44.316866770Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:44.317497 containerd[1680]: time="2025-12-16T12:15:44.317450773Z" level=info msg="Pulled image 
\"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.380799382s" Dec 16 12:15:44.317497 containerd[1680]: time="2025-12-16T12:15:44.317495133Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 12:15:44.318854 containerd[1680]: time="2025-12-16T12:15:44.318795180Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:15:44.327236 containerd[1680]: time="2025-12-16T12:15:44.327185982Z" level=info msg="CreateContainer within sandbox \"11c89bee84b274c7f9594c06a3bcb6534e8eaf6e6a1cacf97204e8ba13f83cf0\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:15:44.337829 containerd[1680]: time="2025-12-16T12:15:44.337769276Z" level=info msg="Container f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:44.340375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4084216239.mount: Deactivated successfully. 
Dec 16 12:15:44.347012 containerd[1680]: time="2025-12-16T12:15:44.346954483Z" level=info msg="CreateContainer within sandbox \"11c89bee84b274c7f9594c06a3bcb6534e8eaf6e6a1cacf97204e8ba13f83cf0\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78\"" Dec 16 12:15:44.347498 containerd[1680]: time="2025-12-16T12:15:44.347449126Z" level=info msg="StartContainer for \"f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78\"" Dec 16 12:15:44.348705 containerd[1680]: time="2025-12-16T12:15:44.348667492Z" level=info msg="connecting to shim f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78" address="unix:///run/containerd/s/e44db5688cbfa9a7ba8b76b0c9227f9f3e87e3c0102c579a4ca4f7e7afb2feff" protocol=ttrpc version=3 Dec 16 12:15:44.376395 systemd[1]: Started cri-containerd-f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78.scope - libcontainer container f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78. 
Dec 16 12:15:44.387000 audit: BPF prog-id=161 op=LOAD Dec 16 12:15:44.387000 audit: BPF prog-id=162 op=LOAD Dec 16 12:15:44.387000 audit[3480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:44.387000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.387000 audit: BPF prog-id=162 op=UNLOAD Dec 16 12:15:44.387000 audit[3480]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:44.387000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.387000 audit: BPF prog-id=163 op=LOAD Dec 16 12:15:44.387000 audit[3480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:44.387000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.388000 audit: BPF prog-id=164 op=LOAD Dec 16 12:15:44.388000 audit[3480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:44.388000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.388000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:15:44.388000 audit[3480]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:44.388000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.388000 audit: BPF prog-id=163 op=UNLOAD Dec 16 12:15:44.388000 audit[3480]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:15:44.388000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.388000 audit: BPF prog-id=165 op=LOAD Dec 16 12:15:44.388000 audit[3480]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3308 pid=3480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:44.388000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6635666638303932653734613634623337333738633234613032346139 Dec 16 12:15:44.413616 containerd[1680]: time="2025-12-16T12:15:44.413576543Z" level=info msg="StartContainer for \"f5ff8092e74a64b37378c24a024a9826bb876f90cf3d7c18fbdbd950225f6b78\" returns successfully" Dec 16 12:15:44.768720 kubelet[2884]: E1216 12:15:44.768679 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.768720 kubelet[2884]: W1216 12:15:44.768712 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.768720 kubelet[2884]: E1216 12:15:44.768733 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.769237 kubelet[2884]: E1216 12:15:44.768928 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769237 kubelet[2884]: W1216 12:15:44.768935 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769237 kubelet[2884]: E1216 12:15:44.768975 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.769237 kubelet[2884]: E1216 12:15:44.769129 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769237 kubelet[2884]: W1216 12:15:44.769136 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769237 kubelet[2884]: E1216 12:15:44.769144 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.769389 kubelet[2884]: E1216 12:15:44.769284 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769389 kubelet[2884]: W1216 12:15:44.769304 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769389 kubelet[2884]: E1216 12:15:44.769312 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.769486 kubelet[2884]: E1216 12:15:44.769475 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769486 kubelet[2884]: W1216 12:15:44.769485 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769537 kubelet[2884]: E1216 12:15:44.769493 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.769627 kubelet[2884]: E1216 12:15:44.769617 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769627 kubelet[2884]: W1216 12:15:44.769626 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769675 kubelet[2884]: E1216 12:15:44.769634 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.769768 kubelet[2884]: E1216 12:15:44.769759 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769768 kubelet[2884]: W1216 12:15:44.769768 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769825 kubelet[2884]: E1216 12:15:44.769776 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.769893 kubelet[2884]: E1216 12:15:44.769884 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.769893 kubelet[2884]: W1216 12:15:44.769893 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.769944 kubelet[2884]: E1216 12:15:44.769902 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.770040 kubelet[2884]: E1216 12:15:44.770030 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770065 kubelet[2884]: W1216 12:15:44.770040 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770065 kubelet[2884]: E1216 12:15:44.770047 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.770190 kubelet[2884]: E1216 12:15:44.770181 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770190 kubelet[2884]: W1216 12:15:44.770190 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770240 kubelet[2884]: E1216 12:15:44.770198 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.770327 kubelet[2884]: E1216 12:15:44.770318 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770355 kubelet[2884]: W1216 12:15:44.770327 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770355 kubelet[2884]: E1216 12:15:44.770336 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.770460 kubelet[2884]: E1216 12:15:44.770451 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770460 kubelet[2884]: W1216 12:15:44.770460 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770506 kubelet[2884]: E1216 12:15:44.770468 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.770601 kubelet[2884]: E1216 12:15:44.770591 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770601 kubelet[2884]: W1216 12:15:44.770600 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770663 kubelet[2884]: E1216 12:15:44.770608 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.770740 kubelet[2884]: E1216 12:15:44.770731 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770740 kubelet[2884]: W1216 12:15:44.770740 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770791 kubelet[2884]: E1216 12:15:44.770747 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.770879 kubelet[2884]: E1216 12:15:44.770868 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.770905 kubelet[2884]: W1216 12:15:44.770879 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.770905 kubelet[2884]: E1216 12:15:44.770887 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.790499 kubelet[2884]: E1216 12:15:44.790399 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.790499 kubelet[2884]: W1216 12:15:44.790423 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.790499 kubelet[2884]: E1216 12:15:44.790442 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.790688 kubelet[2884]: E1216 12:15:44.790672 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.790688 kubelet[2884]: W1216 12:15:44.790683 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.790747 kubelet[2884]: E1216 12:15:44.790697 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.790881 kubelet[2884]: E1216 12:15:44.790870 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.790881 kubelet[2884]: W1216 12:15:44.790881 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.790932 kubelet[2884]: E1216 12:15:44.790896 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.791134 kubelet[2884]: E1216 12:15:44.791120 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.791134 kubelet[2884]: W1216 12:15:44.791131 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.791209 kubelet[2884]: E1216 12:15:44.791146 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.791330 kubelet[2884]: E1216 12:15:44.791295 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.791330 kubelet[2884]: W1216 12:15:44.791306 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.791330 kubelet[2884]: E1216 12:15:44.791317 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.791451 kubelet[2884]: E1216 12:15:44.791440 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.791451 kubelet[2884]: W1216 12:15:44.791449 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.791638 kubelet[2884]: E1216 12:15:44.791462 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.791730 kubelet[2884]: E1216 12:15:44.791712 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.791781 kubelet[2884]: W1216 12:15:44.791770 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.791841 kubelet[2884]: E1216 12:15:44.791829 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.792019 kubelet[2884]: E1216 12:15:44.792004 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.792019 kubelet[2884]: W1216 12:15:44.792017 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.792129 kubelet[2884]: E1216 12:15:44.792032 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.792210 kubelet[2884]: E1216 12:15:44.792197 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.792210 kubelet[2884]: W1216 12:15:44.792208 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.792260 kubelet[2884]: E1216 12:15:44.792221 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.792357 kubelet[2884]: E1216 12:15:44.792346 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.792357 kubelet[2884]: W1216 12:15:44.792355 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.792425 kubelet[2884]: E1216 12:15:44.792368 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.792533 kubelet[2884]: E1216 12:15:44.792522 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.792566 kubelet[2884]: W1216 12:15:44.792532 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.792566 kubelet[2884]: E1216 12:15:44.792546 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.792784 kubelet[2884]: E1216 12:15:44.792770 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.792856 kubelet[2884]: W1216 12:15:44.792843 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.792938 kubelet[2884]: E1216 12:15:44.792925 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.793133 kubelet[2884]: E1216 12:15:44.793112 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.793133 kubelet[2884]: W1216 12:15:44.793127 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.793192 kubelet[2884]: E1216 12:15:44.793144 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.793306 kubelet[2884]: E1216 12:15:44.793291 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.793306 kubelet[2884]: W1216 12:15:44.793301 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.793374 kubelet[2884]: E1216 12:15:44.793314 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.793480 kubelet[2884]: E1216 12:15:44.793467 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.793480 kubelet[2884]: W1216 12:15:44.793478 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.793537 kubelet[2884]: E1216 12:15:44.793492 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.793744 kubelet[2884]: E1216 12:15:44.793730 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.793815 kubelet[2884]: W1216 12:15:44.793801 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.793872 kubelet[2884]: E1216 12:15:44.793862 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:44.794102 kubelet[2884]: E1216 12:15:44.794064 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.794168 kubelet[2884]: W1216 12:15:44.794155 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.794225 kubelet[2884]: E1216 12:15:44.794215 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:44.794418 kubelet[2884]: E1216 12:15:44.794406 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:44.794501 kubelet[2884]: W1216 12:15:44.794487 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:44.794559 kubelet[2884]: E1216 12:15:44.794547 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.648961 kubelet[2884]: E1216 12:15:45.648879 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:45.723529 kubelet[2884]: I1216 12:15:45.723465 2884 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:15:45.733702 containerd[1680]: time="2025-12-16T12:15:45.733645755Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:45.734502 containerd[1680]: time="2025-12-16T12:15:45.734456679Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 12:15:45.736479 containerd[1680]: time="2025-12-16T12:15:45.736314649Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:45.738693 containerd[1680]: time="2025-12-16T12:15:45.738651381Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:45.739408 containerd[1680]: time="2025-12-16T12:15:45.739366104Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size 
\"5636392\" in 1.420534044s" Dec 16 12:15:45.739408 containerd[1680]: time="2025-12-16T12:15:45.739397944Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 12:15:45.743153 containerd[1680]: time="2025-12-16T12:15:45.743107843Z" level=info msg="CreateContainer within sandbox \"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:15:45.752576 containerd[1680]: time="2025-12-16T12:15:45.752242690Z" level=info msg="Container 3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:45.766218 containerd[1680]: time="2025-12-16T12:15:45.766134401Z" level=info msg="CreateContainer within sandbox \"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72\"" Dec 16 12:15:45.768065 containerd[1680]: time="2025-12-16T12:15:45.768036250Z" level=info msg="StartContainer for \"3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72\"" Dec 16 12:15:45.770364 containerd[1680]: time="2025-12-16T12:15:45.770331902Z" level=info msg="connecting to shim 3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72" address="unix:///run/containerd/s/46b4aab1dcd2e3d75ce022e3cf996172cada088dd310f91f38aa62000b2e8b87" protocol=ttrpc version=3 Dec 16 12:15:45.776858 kubelet[2884]: E1216 12:15:45.776828 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.776858 kubelet[2884]: W1216 12:15:45.776851 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: 
executable file not found in $PATH, output: "" Dec 16 12:15:45.777215 kubelet[2884]: E1216 12:15:45.776886 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.777215 kubelet[2884]: E1216 12:15:45.777130 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.777215 kubelet[2884]: W1216 12:15:45.777139 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.777215 kubelet[2884]: E1216 12:15:45.777149 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.777323 kubelet[2884]: E1216 12:15:45.777282 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.777323 kubelet[2884]: W1216 12:15:45.777290 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.777323 kubelet[2884]: E1216 12:15:45.777317 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.777523 kubelet[2884]: E1216 12:15:45.777509 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.777523 kubelet[2884]: W1216 12:15:45.777521 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.777567 kubelet[2884]: E1216 12:15:45.777539 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.777785 kubelet[2884]: E1216 12:15:45.777770 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.777807 kubelet[2884]: W1216 12:15:45.777793 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.777807 kubelet[2884]: E1216 12:15:45.777804 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.777979 kubelet[2884]: E1216 12:15:45.777966 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.778003 kubelet[2884]: W1216 12:15:45.777978 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.778003 kubelet[2884]: E1216 12:15:45.777987 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.778299 kubelet[2884]: E1216 12:15:45.778281 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.778299 kubelet[2884]: W1216 12:15:45.778294 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.778299 kubelet[2884]: E1216 12:15:45.778303 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.778637 kubelet[2884]: E1216 12:15:45.778620 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.778637 kubelet[2884]: W1216 12:15:45.778634 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.778682 kubelet[2884]: E1216 12:15:45.778644 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.780106 kubelet[2884]: E1216 12:15:45.779862 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.780106 kubelet[2884]: W1216 12:15:45.779879 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.780106 kubelet[2884]: E1216 12:15:45.779890 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.780236 kubelet[2884]: E1216 12:15:45.780127 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.780236 kubelet[2884]: W1216 12:15:45.780137 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.780236 kubelet[2884]: E1216 12:15:45.780146 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.780330 kubelet[2884]: E1216 12:15:45.780310 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.780330 kubelet[2884]: W1216 12:15:45.780322 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.780330 kubelet[2884]: E1216 12:15:45.780330 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.780525 kubelet[2884]: E1216 12:15:45.780509 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.780525 kubelet[2884]: W1216 12:15:45.780521 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.780575 kubelet[2884]: E1216 12:15:45.780530 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.780892 kubelet[2884]: E1216 12:15:45.780872 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.780940 kubelet[2884]: W1216 12:15:45.780896 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.780940 kubelet[2884]: E1216 12:15:45.780919 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.781154 kubelet[2884]: E1216 12:15:45.781139 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.781187 kubelet[2884]: W1216 12:15:45.781168 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.781208 kubelet[2884]: E1216 12:15:45.781190 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.781391 kubelet[2884]: E1216 12:15:45.781377 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.781391 kubelet[2884]: W1216 12:15:45.781389 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.781445 kubelet[2884]: E1216 12:15:45.781397 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.786272 systemd[1]: Started cri-containerd-3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72.scope - libcontainer container 3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72. 
Dec 16 12:15:45.798290 kubelet[2884]: E1216 12:15:45.798262 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.798290 kubelet[2884]: W1216 12:15:45.798284 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.798420 kubelet[2884]: E1216 12:15:45.798318 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.798559 kubelet[2884]: E1216 12:15:45.798548 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.798559 kubelet[2884]: W1216 12:15:45.798559 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.798639 kubelet[2884]: E1216 12:15:45.798574 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.798775 kubelet[2884]: E1216 12:15:45.798761 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.798775 kubelet[2884]: W1216 12:15:45.798774 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.798818 kubelet[2884]: E1216 12:15:45.798789 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.798947 kubelet[2884]: E1216 12:15:45.798927 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.798947 kubelet[2884]: W1216 12:15:45.798945 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.799006 kubelet[2884]: E1216 12:15:45.798957 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.799295 kubelet[2884]: E1216 12:15:45.799277 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.799295 kubelet[2884]: W1216 12:15:45.799289 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.799354 kubelet[2884]: E1216 12:15:45.799303 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.799494 kubelet[2884]: E1216 12:15:45.799476 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.799494 kubelet[2884]: W1216 12:15:45.799486 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.799554 kubelet[2884]: E1216 12:15:45.799503 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.799753 kubelet[2884]: E1216 12:15:45.799734 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.799783 kubelet[2884]: W1216 12:15:45.799752 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.799783 kubelet[2884]: E1216 12:15:45.799769 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.799943 kubelet[2884]: E1216 12:15:45.799922 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.799943 kubelet[2884]: W1216 12:15:45.799942 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.799999 kubelet[2884]: E1216 12:15:45.799957 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.800799 kubelet[2884]: E1216 12:15:45.800767 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.800799 kubelet[2884]: W1216 12:15:45.800793 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.800919 kubelet[2884]: E1216 12:15:45.800810 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.801046 kubelet[2884]: E1216 12:15:45.801021 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.801046 kubelet[2884]: W1216 12:15:45.801035 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.801123 kubelet[2884]: E1216 12:15:45.801099 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.801289 kubelet[2884]: E1216 12:15:45.801271 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.801289 kubelet[2884]: W1216 12:15:45.801283 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.801447 kubelet[2884]: E1216 12:15:45.801349 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.801589 kubelet[2884]: E1216 12:15:45.801528 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.801589 kubelet[2884]: W1216 12:15:45.801540 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.801589 kubelet[2884]: E1216 12:15:45.801581 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.801796 kubelet[2884]: E1216 12:15:45.801760 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.801796 kubelet[2884]: W1216 12:15:45.801769 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.801796 kubelet[2884]: E1216 12:15:45.801782 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.802249 kubelet[2884]: E1216 12:15:45.802224 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.802249 kubelet[2884]: W1216 12:15:45.802239 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.802366 kubelet[2884]: E1216 12:15:45.802249 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.802545 kubelet[2884]: E1216 12:15:45.802528 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.802545 kubelet[2884]: W1216 12:15:45.802542 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.802701 kubelet[2884]: E1216 12:15:45.802560 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.802838 kubelet[2884]: E1216 12:15:45.802754 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.802838 kubelet[2884]: W1216 12:15:45.802765 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.802838 kubelet[2884]: E1216 12:15:45.802774 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.803006 kubelet[2884]: E1216 12:15:45.802940 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.803006 kubelet[2884]: W1216 12:15:45.802951 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.803006 kubelet[2884]: E1216 12:15:45.802976 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:15:45.803358 kubelet[2884]: E1216 12:15:45.803297 2884 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:15:45.803358 kubelet[2884]: W1216 12:15:45.803310 2884 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:15:45.803358 kubelet[2884]: E1216 12:15:45.803319 2884 plugins.go:695] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:15:45.848000 audit: BPF prog-id=166 op=LOAD Dec 16 12:15:45.848000 audit[3559]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3403 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:45.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365336334353831306232646335336164346361626330323534383237 Dec 16 12:15:45.848000 audit: BPF prog-id=167 op=LOAD Dec 16 12:15:45.848000 audit[3559]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3403 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:45.848000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365336334353831306232646335336164346361626330323534383237 Dec 16 12:15:45.848000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:15:45.848000 audit[3559]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:45.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365336334353831306232646335336164346361626330323534383237 Dec 16 12:15:45.848000 audit: BPF prog-id=166 op=UNLOAD Dec 16 12:15:45.848000 audit[3559]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:45.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365336334353831306232646335336164346361626330323534383237 Dec 16 12:15:45.848000 audit: BPF prog-id=168 op=LOAD Dec 16 12:15:45.848000 audit[3559]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3403 pid=3559 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:15:45.848000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365336334353831306232646335336164346361626330323534383237 Dec 16 12:15:45.868670 containerd[1680]: time="2025-12-16T12:15:45.868495883Z" level=info msg="StartContainer for \"3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72\" returns successfully" Dec 16 12:15:45.885465 systemd[1]: cri-containerd-3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72.scope: Deactivated successfully. Dec 16 12:15:45.888901 containerd[1680]: time="2025-12-16T12:15:45.888857067Z" level=info msg="received container exit event container_id:\"3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72\" id:\"3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72\" pid:3587 exited_at:{seconds:1765887345 nanos:888420224}" Dec 16 12:15:45.890000 audit: BPF prog-id=168 op=UNLOAD Dec 16 12:15:45.914041 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-3e3c45810b2dc53ad4cabc02548276a28d7a907b95b2aea8f8531576eaf83e72-rootfs.mount: Deactivated successfully. 
Dec 16 12:15:47.611416 kubelet[2884]: I1216 12:15:47.611338 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-69b4fdfc45-t24v9" podStartSLOduration=4.229323503 podStartE2EDuration="6.611319651s" podCreationTimestamp="2025-12-16 12:15:41 +0000 UTC" firstStartedPulling="2025-12-16 12:15:41.93639239 +0000 UTC m=+21.555385850" lastFinishedPulling="2025-12-16 12:15:44.318388538 +0000 UTC m=+23.937381998" observedRunningTime="2025-12-16 12:15:44.73656095 +0000 UTC m=+24.355554410" watchObservedRunningTime="2025-12-16 12:15:47.611319651 +0000 UTC m=+27.230313111" Dec 16 12:15:47.648868 kubelet[2884]: E1216 12:15:47.648407 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:49.649052 kubelet[2884]: E1216 12:15:49.648672 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:50.736432 containerd[1680]: time="2025-12-16T12:15:50.736366588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:15:51.648426 kubelet[2884]: E1216 12:15:51.648336 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:53.649175 kubelet[2884]: E1216 12:15:53.649060 2884 pod_workers.go:1301] 
"Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:53.834243 containerd[1680]: time="2025-12-16T12:15:53.834196666Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:53.835346 containerd[1680]: time="2025-12-16T12:15:53.835303432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Dec 16 12:15:53.836475 containerd[1680]: time="2025-12-16T12:15:53.836414718Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:53.838633 containerd[1680]: time="2025-12-16T12:15:53.838586209Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:15:53.839218 containerd[1680]: time="2025-12-16T12:15:53.839186012Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.102673983s" Dec 16 12:15:53.839264 containerd[1680]: time="2025-12-16T12:15:53.839225092Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 16 12:15:53.841505 containerd[1680]: time="2025-12-16T12:15:53.841441503Z" 
level=info msg="CreateContainer within sandbox \"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:15:53.850104 containerd[1680]: time="2025-12-16T12:15:53.849522504Z" level=info msg="Container de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:53.859449 containerd[1680]: time="2025-12-16T12:15:53.859266994Z" level=info msg="CreateContainer within sandbox \"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29\"" Dec 16 12:15:53.861435 containerd[1680]: time="2025-12-16T12:15:53.860749562Z" level=info msg="StartContainer for \"de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29\"" Dec 16 12:15:53.862646 containerd[1680]: time="2025-12-16T12:15:53.862586771Z" level=info msg="connecting to shim de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29" address="unix:///run/containerd/s/46b4aab1dcd2e3d75ce022e3cf996172cada088dd310f91f38aa62000b2e8b87" protocol=ttrpc version=3 Dec 16 12:15:53.888449 systemd[1]: Started cri-containerd-de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29.scope - libcontainer container de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29. 
Dec 16 12:15:53.942000 audit: BPF prog-id=169 op=LOAD Dec 16 12:15:53.944248 kernel: kauditd_printk_skb: 90 callbacks suppressed Dec 16 12:15:53.944297 kernel: audit: type=1334 audit(1765887353.942:569): prog-id=169 op=LOAD Dec 16 12:15:53.942000 audit[3657]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.947831 kernel: audit: type=1300 audit(1765887353.942:569): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.947917 kernel: audit: type=1327 audit(1765887353.942:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.942000 audit: BPF prog-id=170 op=LOAD Dec 16 12:15:53.951375 kernel: audit: type=1334 audit(1765887353.942:570): prog-id=170 op=LOAD Dec 16 12:15:53.951417 kernel: audit: type=1300 audit(1765887353.942:570): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.942000 audit[3657]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.942000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.957161 kernel: audit: type=1327 audit(1765887353.942:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.943000 audit: BPF prog-id=170 op=UNLOAD Dec 16 12:15:53.958482 kernel: audit: type=1334 audit(1765887353.943:571): prog-id=170 op=UNLOAD Dec 16 12:15:53.958552 kernel: audit: type=1300 audit(1765887353.943:571): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.943000 audit[3657]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.943000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.964379 kernel: audit: type=1327 audit(1765887353.943:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.943000 audit: BPF prog-id=169 op=UNLOAD Dec 16 12:15:53.965448 kernel: audit: type=1334 audit(1765887353.943:572): prog-id=169 op=UNLOAD Dec 16 12:15:53.943000 audit[3657]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.943000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.943000 audit: BPF prog-id=171 op=LOAD Dec 16 12:15:53.943000 audit[3657]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3403 pid=3657 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:53.943000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6465383435643835653165666438633131656439663037313262613130 Dec 16 12:15:53.983436 containerd[1680]: time="2025-12-16T12:15:53.983363427Z" level=info msg="StartContainer for \"de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29\" returns successfully" Dec 16 12:15:55.244193 containerd[1680]: time="2025-12-16T12:15:55.244142617Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:15:55.246410 systemd[1]: cri-containerd-de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29.scope: Deactivated successfully. Dec 16 12:15:55.246723 systemd[1]: cri-containerd-de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29.scope: Consumed 451ms CPU time, 189.4M memory peak, 165.9M written to disk. Dec 16 12:15:55.248395 containerd[1680]: time="2025-12-16T12:15:55.248363158Z" level=info msg="received container exit event container_id:\"de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29\" id:\"de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29\" pid:3671 exited_at:{seconds:1765887355 nanos:248118037}" Dec 16 12:15:55.254000 audit: BPF prog-id=171 op=UNLOAD Dec 16 12:15:55.271860 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-de845d85e1efd8c11ed9f0712ba10c8d7ec8eaf2c0b1c7d5401267e547e64a29-rootfs.mount: Deactivated successfully. 
Dec 16 12:15:55.303006 kubelet[2884]: I1216 12:15:55.302970 2884 kubelet_node_status.go:501] "Fast updating node status as it just became ready" Dec 16 12:15:55.653721 systemd[1]: Created slice kubepods-besteffort-pod504cf836_455c_42a5_8d68_245e5d4890cf.slice - libcontainer container kubepods-besteffort-pod504cf836_455c_42a5_8d68_245e5d4890cf.slice. Dec 16 12:15:55.911724 containerd[1680]: time="2025-12-16T12:15:55.911526900Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2hkqn,Uid:504cf836-455c-42a5-8d68-245e5d4890cf,Namespace:calico-system,Attempt:0,}" Dec 16 12:15:56.593873 systemd[1]: Created slice kubepods-besteffort-pod13366e02_1117_46c3_a880_8d8cc6c423f8.slice - libcontainer container kubepods-besteffort-pod13366e02_1117_46c3_a880_8d8cc6c423f8.slice. Dec 16 12:15:56.606549 systemd[1]: Created slice kubepods-burstable-podf1aba02a_fd77_4f59_8d87_b35648e9b9d3.slice - libcontainer container kubepods-burstable-podf1aba02a_fd77_4f59_8d87_b35648e9b9d3.slice. Dec 16 12:15:56.613975 systemd[1]: Created slice kubepods-besteffort-pod384a06af_c494_40e9_b0a8_31b5c5a33ae4.slice - libcontainer container kubepods-besteffort-pod384a06af_c494_40e9_b0a8_31b5c5a33ae4.slice. Dec 16 12:15:56.621901 systemd[1]: Created slice kubepods-besteffort-pod576d8526_5af6_453c_afc4_7ebd613c4146.slice - libcontainer container kubepods-besteffort-pod576d8526_5af6_453c_afc4_7ebd613c4146.slice. Dec 16 12:15:56.630602 systemd[1]: Created slice kubepods-besteffort-pod3cfa8afe_d370_4d42_b9ea_f53cfd764b71.slice - libcontainer container kubepods-besteffort-pod3cfa8afe_d370_4d42_b9ea_f53cfd764b71.slice. Dec 16 12:15:56.635956 systemd[1]: Created slice kubepods-burstable-pod4f3aa479_c591_4d08_8703_ea7e260802a6.slice - libcontainer container kubepods-burstable-pod4f3aa479_c591_4d08_8703_ea7e260802a6.slice. 
Dec 16 12:15:56.643044 systemd[1]: Created slice kubepods-besteffort-pod896c3574_3482_4970_a592_5c7752aa620e.slice - libcontainer container kubepods-besteffort-pod896c3574_3482_4970_a592_5c7752aa620e.slice. Dec 16 12:15:56.647825 systemd[1]: Created slice kubepods-besteffort-pod9de1278d_946f_4d27_9201_402be6a50469.slice - libcontainer container kubepods-besteffort-pod9de1278d_946f_4d27_9201_402be6a50469.slice. Dec 16 12:15:56.670824 kubelet[2884]: I1216 12:15:56.670767 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/896c3574-3482-4970-a592-5c7752aa620e-goldmane-ca-bundle\") pod \"goldmane-666569f655-v5bnx\" (UID: \"896c3574-3482-4970-a592-5c7752aa620e\") " pod="calico-system/goldmane-666569f655-v5bnx" Dec 16 12:15:56.670824 kubelet[2884]: I1216 12:15:56.670814 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p6kpv\" (UniqueName: \"kubernetes.io/projected/896c3574-3482-4970-a592-5c7752aa620e-kube-api-access-p6kpv\") pod \"goldmane-666569f655-v5bnx\" (UID: \"896c3574-3482-4970-a592-5c7752aa620e\") " pod="calico-system/goldmane-666569f655-v5bnx" Dec 16 12:15:56.671321 kubelet[2884]: I1216 12:15:56.670836 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l2lmg\" (UniqueName: \"kubernetes.io/projected/13366e02-1117-46c3-a880-8d8cc6c423f8-kube-api-access-l2lmg\") pod \"calico-apiserver-6784c79f67-sbbxz\" (UID: \"13366e02-1117-46c3-a880-8d8cc6c423f8\") " pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" Dec 16 12:15:56.671321 kubelet[2884]: I1216 12:15:56.670854 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ptdjf\" (UniqueName: \"kubernetes.io/projected/384a06af-c494-40e9-b0a8-31b5c5a33ae4-kube-api-access-ptdjf\") pod 
\"calico-kube-controllers-6b9b8c464c-4jgjb\" (UID: \"384a06af-c494-40e9-b0a8-31b5c5a33ae4\") " pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" Dec 16 12:15:56.671321 kubelet[2884]: I1216 12:15:56.670874 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/896c3574-3482-4970-a592-5c7752aa620e-goldmane-key-pair\") pod \"goldmane-666569f655-v5bnx\" (UID: \"896c3574-3482-4970-a592-5c7752aa620e\") " pod="calico-system/goldmane-666569f655-v5bnx" Dec 16 12:15:56.671321 kubelet[2884]: I1216 12:15:56.670891 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de1278d-946f-4d27-9201-402be6a50469-whisker-ca-bundle\") pod \"whisker-695c59c79-wrg8j\" (UID: \"9de1278d-946f-4d27-9201-402be6a50469\") " pod="calico-system/whisker-695c59c79-wrg8j" Dec 16 12:15:56.671321 kubelet[2884]: I1216 12:15:56.670910 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/3cfa8afe-d370-4d42-b9ea-f53cfd764b71-calico-apiserver-certs\") pod \"calico-apiserver-6784c79f67-5nkhx\" (UID: \"3cfa8afe-d370-4d42-b9ea-f53cfd764b71\") " pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" Dec 16 12:15:56.671443 kubelet[2884]: I1216 12:15:56.670926 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/13366e02-1117-46c3-a880-8d8cc6c423f8-calico-apiserver-certs\") pod \"calico-apiserver-6784c79f67-sbbxz\" (UID: \"13366e02-1117-46c3-a880-8d8cc6c423f8\") " pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" Dec 16 12:15:56.671443 kubelet[2884]: I1216 12:15:56.670979 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9de1278d-946f-4d27-9201-402be6a50469-whisker-backend-key-pair\") pod \"whisker-695c59c79-wrg8j\" (UID: \"9de1278d-946f-4d27-9201-402be6a50469\") " pod="calico-system/whisker-695c59c79-wrg8j" Dec 16 12:15:56.671443 kubelet[2884]: I1216 12:15:56.671000 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4f3aa479-c591-4d08-8703-ea7e260802a6-config-volume\") pod \"coredns-668d6bf9bc-n6pr2\" (UID: \"4f3aa479-c591-4d08-8703-ea7e260802a6\") " pod="kube-system/coredns-668d6bf9bc-n6pr2" Dec 16 12:15:56.671443 kubelet[2884]: I1216 12:15:56.671160 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9khtg\" (UniqueName: \"kubernetes.io/projected/9de1278d-946f-4d27-9201-402be6a50469-kube-api-access-9khtg\") pod \"whisker-695c59c79-wrg8j\" (UID: \"9de1278d-946f-4d27-9201-402be6a50469\") " pod="calico-system/whisker-695c59c79-wrg8j" Dec 16 12:15:56.671443 kubelet[2884]: I1216 12:15:56.671192 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rh9ns\" (UniqueName: \"kubernetes.io/projected/f1aba02a-fd77-4f59-8d87-b35648e9b9d3-kube-api-access-rh9ns\") pod \"coredns-668d6bf9bc-7w7l8\" (UID: \"f1aba02a-fd77-4f59-8d87-b35648e9b9d3\") " pod="kube-system/coredns-668d6bf9bc-7w7l8" Dec 16 12:15:56.671545 kubelet[2884]: I1216 12:15:56.671241 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wjrmj\" (UniqueName: \"kubernetes.io/projected/3cfa8afe-d370-4d42-b9ea-f53cfd764b71-kube-api-access-wjrmj\") pod \"calico-apiserver-6784c79f67-5nkhx\" (UID: \"3cfa8afe-d370-4d42-b9ea-f53cfd764b71\") " pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" Dec 16 12:15:56.671545 kubelet[2884]: I1216 12:15:56.671264 2884 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/896c3574-3482-4970-a592-5c7752aa620e-config\") pod \"goldmane-666569f655-v5bnx\" (UID: \"896c3574-3482-4970-a592-5c7752aa620e\") " pod="calico-system/goldmane-666569f655-v5bnx" Dec 16 12:15:56.671545 kubelet[2884]: I1216 12:15:56.671309 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4pz4\" (UniqueName: \"kubernetes.io/projected/4f3aa479-c591-4d08-8703-ea7e260802a6-kube-api-access-l4pz4\") pod \"coredns-668d6bf9bc-n6pr2\" (UID: \"4f3aa479-c591-4d08-8703-ea7e260802a6\") " pod="kube-system/coredns-668d6bf9bc-n6pr2" Dec 16 12:15:56.671545 kubelet[2884]: I1216 12:15:56.671353 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/f1aba02a-fd77-4f59-8d87-b35648e9b9d3-config-volume\") pod \"coredns-668d6bf9bc-7w7l8\" (UID: \"f1aba02a-fd77-4f59-8d87-b35648e9b9d3\") " pod="kube-system/coredns-668d6bf9bc-7w7l8" Dec 16 12:15:56.671545 kubelet[2884]: I1216 12:15:56.671389 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/576d8526-5af6-453c-afc4-7ebd613c4146-calico-apiserver-certs\") pod \"calico-apiserver-9bb48c66-mjzv6\" (UID: \"576d8526-5af6-453c-afc4-7ebd613c4146\") " pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" Dec 16 12:15:56.671648 kubelet[2884]: I1216 12:15:56.671412 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/384a06af-c494-40e9-b0a8-31b5c5a33ae4-tigera-ca-bundle\") pod \"calico-kube-controllers-6b9b8c464c-4jgjb\" (UID: \"384a06af-c494-40e9-b0a8-31b5c5a33ae4\") " pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" 
Dec 16 12:15:56.671648 kubelet[2884]: I1216 12:15:56.671432 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckh7w\" (UniqueName: \"kubernetes.io/projected/576d8526-5af6-453c-afc4-7ebd613c4146-kube-api-access-ckh7w\") pod \"calico-apiserver-9bb48c66-mjzv6\" (UID: \"576d8526-5af6-453c-afc4-7ebd613c4146\") " pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" Dec 16 12:15:56.756804 containerd[1680]: time="2025-12-16T12:15:56.756573010Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:15:56.807037 containerd[1680]: time="2025-12-16T12:15:56.806969027Z" level=error msg="Failed to destroy network for sandbox \"a3d4b94ebfabef52a9144ad63d877ddbfeb605f220e67569ee5d131bb8e1e6ee\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:56.815304 containerd[1680]: time="2025-12-16T12:15:56.815247109Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2hkqn,Uid:504cf836-455c-42a5-8d68-245e5d4890cf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d4b94ebfabef52a9144ad63d877ddbfeb605f220e67569ee5d131bb8e1e6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:56.815498 kubelet[2884]: E1216 12:15:56.815460 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d4b94ebfabef52a9144ad63d877ddbfeb605f220e67569ee5d131bb8e1e6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 
12:15:56.815557 kubelet[2884]: E1216 12:15:56.815526 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d4b94ebfabef52a9144ad63d877ddbfeb605f220e67569ee5d131bb8e1e6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:56.815557 kubelet[2884]: E1216 12:15:56.815546 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d4b94ebfabef52a9144ad63d877ddbfeb605f220e67569ee5d131bb8e1e6ee\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-2hkqn" Dec 16 12:15:56.815606 kubelet[2884]: E1216 12:15:56.815585 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3d4b94ebfabef52a9144ad63d877ddbfeb605f220e67569ee5d131bb8e1e6ee\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:15:56.904262 containerd[1680]: time="2025-12-16T12:15:56.904215883Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-6784c79f67-sbbxz,Uid:13366e02-1117-46c3-a880-8d8cc6c423f8,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:15:56.911004 containerd[1680]: time="2025-12-16T12:15:56.910966397Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7w7l8,Uid:f1aba02a-fd77-4f59-8d87-b35648e9b9d3,Namespace:kube-system,Attempt:0,}" Dec 16 12:15:56.920641 containerd[1680]: time="2025-12-16T12:15:56.920588566Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b9b8c464c-4jgjb,Uid:384a06af-c494-40e9-b0a8-31b5c5a33ae4,Namespace:calico-system,Attempt:0,}" Dec 16 12:15:56.929361 containerd[1680]: time="2025-12-16T12:15:56.929323531Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb48c66-mjzv6,Uid:576d8526-5af6-453c-afc4-7ebd613c4146,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:15:56.935705 containerd[1680]: time="2025-12-16T12:15:56.935637643Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-5nkhx,Uid:3cfa8afe-d370-4d42-b9ea-f53cfd764b71,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:15:56.939939 containerd[1680]: time="2025-12-16T12:15:56.939848625Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n6pr2,Uid:4f3aa479-c591-4d08-8703-ea7e260802a6,Namespace:kube-system,Attempt:0,}" Dec 16 12:15:56.947201 containerd[1680]: time="2025-12-16T12:15:56.947163302Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v5bnx,Uid:896c3574-3482-4970-a592-5c7752aa620e,Namespace:calico-system,Attempt:0,}" Dec 16 12:15:56.952206 containerd[1680]: time="2025-12-16T12:15:56.952154967Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-695c59c79-wrg8j,Uid:9de1278d-946f-4d27-9201-402be6a50469,Namespace:calico-system,Attempt:0,}" Dec 16 12:15:56.997710 containerd[1680]: time="2025-12-16T12:15:56.997658279Z" level=error msg="Failed to destroy network for sandbox 
\"c27e3e8209808c8213dec3a3c2e1a8b37004188e89b8f32c66c7a3d4697383d8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.003341 containerd[1680]: time="2025-12-16T12:15:57.003278908Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7w7l8,Uid:f1aba02a-fd77-4f59-8d87-b35648e9b9d3,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"c27e3e8209808c8213dec3a3c2e1a8b37004188e89b8f32c66c7a3d4697383d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.003971 kubelet[2884]: E1216 12:15:57.003521 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c27e3e8209808c8213dec3a3c2e1a8b37004188e89b8f32c66c7a3d4697383d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.003971 kubelet[2884]: E1216 12:15:57.003576 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"c27e3e8209808c8213dec3a3c2e1a8b37004188e89b8f32c66c7a3d4697383d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7w7l8" Dec 16 12:15:57.003971 kubelet[2884]: E1216 12:15:57.003603 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"c27e3e8209808c8213dec3a3c2e1a8b37004188e89b8f32c66c7a3d4697383d8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-7w7l8" Dec 16 12:15:57.004125 kubelet[2884]: E1216 12:15:57.003642 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-7w7l8_kube-system(f1aba02a-fd77-4f59-8d87-b35648e9b9d3)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-7w7l8_kube-system(f1aba02a-fd77-4f59-8d87-b35648e9b9d3)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"c27e3e8209808c8213dec3a3c2e1a8b37004188e89b8f32c66c7a3d4697383d8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-7w7l8" podUID="f1aba02a-fd77-4f59-8d87-b35648e9b9d3" Dec 16 12:15:57.007891 containerd[1680]: time="2025-12-16T12:15:57.007846571Z" level=error msg="Failed to destroy network for sandbox \"34499e723cf593a208942a8b67081199eb0bef32e90cd82005827f26caf69024\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.011514 containerd[1680]: time="2025-12-16T12:15:57.011457310Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-sbbxz,Uid:13366e02-1117-46c3-a880-8d8cc6c423f8,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"34499e723cf593a208942a8b67081199eb0bef32e90cd82005827f26caf69024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that 
the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.012061 kubelet[2884]: E1216 12:15:57.011797 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34499e723cf593a208942a8b67081199eb0bef32e90cd82005827f26caf69024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.012178 kubelet[2884]: E1216 12:15:57.012094 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34499e723cf593a208942a8b67081199eb0bef32e90cd82005827f26caf69024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" Dec 16 12:15:57.012178 kubelet[2884]: E1216 12:15:57.012127 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"34499e723cf593a208942a8b67081199eb0bef32e90cd82005827f26caf69024\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" Dec 16 12:15:57.012858 kubelet[2884]: E1216 12:15:57.012172 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8)\\\": rpc error: code = Unknown desc = failed to setup network for 
sandbox \\\"34499e723cf593a208942a8b67081199eb0bef32e90cd82005827f26caf69024\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:15:57.035259 containerd[1680]: time="2025-12-16T12:15:57.035210831Z" level=error msg="Failed to destroy network for sandbox \"3d0d705f10fd5e2bbb41a691b3b6a11b719ff444cdd25058ccee30387442afa6\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.041703 containerd[1680]: time="2025-12-16T12:15:57.041644624Z" level=error msg="Failed to destroy network for sandbox \"052d4beeea9c98224966bd9e5f8ef5f7ed82fd3e66b37f08463dc46e37a02cb1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.042576 containerd[1680]: time="2025-12-16T12:15:57.041886065Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b9b8c464c-4jgjb,Uid:384a06af-c494-40e9-b0a8-31b5c5a33ae4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d0d705f10fd5e2bbb41a691b3b6a11b719ff444cdd25058ccee30387442afa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.042663 kubelet[2884]: E1216 12:15:57.042138 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"3d0d705f10fd5e2bbb41a691b3b6a11b719ff444cdd25058ccee30387442afa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.042663 kubelet[2884]: E1216 12:15:57.042192 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d0d705f10fd5e2bbb41a691b3b6a11b719ff444cdd25058ccee30387442afa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" Dec 16 12:15:57.042663 kubelet[2884]: E1216 12:15:57.042210 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d0d705f10fd5e2bbb41a691b3b6a11b719ff444cdd25058ccee30387442afa6\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" Dec 16 12:15:57.042757 kubelet[2884]: E1216 12:15:57.042244 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3d0d705f10fd5e2bbb41a691b3b6a11b719ff444cdd25058ccee30387442afa6\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:15:57.046253 containerd[1680]: time="2025-12-16T12:15:57.046126487Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb48c66-mjzv6,Uid:576d8526-5af6-453c-afc4-7ebd613c4146,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"052d4beeea9c98224966bd9e5f8ef5f7ed82fd3e66b37f08463dc46e37a02cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.046396 kubelet[2884]: E1216 12:15:57.046329 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"052d4beeea9c98224966bd9e5f8ef5f7ed82fd3e66b37f08463dc46e37a02cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.046396 kubelet[2884]: E1216 12:15:57.046375 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"052d4beeea9c98224966bd9e5f8ef5f7ed82fd3e66b37f08463dc46e37a02cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" Dec 16 12:15:57.046396 kubelet[2884]: E1216 12:15:57.046393 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"052d4beeea9c98224966bd9e5f8ef5f7ed82fd3e66b37f08463dc46e37a02cb1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: 
no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" Dec 16 12:15:57.046488 kubelet[2884]: E1216 12:15:57.046426 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"052d4beeea9c98224966bd9e5f8ef5f7ed82fd3e66b37f08463dc46e37a02cb1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:15:57.056448 containerd[1680]: time="2025-12-16T12:15:57.056387859Z" level=error msg="Failed to destroy network for sandbox \"106ef44abd24463bd0a18be0908aa64baf1993ec35fa2f5376db6cbff97881f9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.061680 containerd[1680]: time="2025-12-16T12:15:57.061620926Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v5bnx,Uid:896c3574-3482-4970-a592-5c7752aa620e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"106ef44abd24463bd0a18be0908aa64baf1993ec35fa2f5376db6cbff97881f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.061970 
kubelet[2884]: E1216 12:15:57.061937 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"106ef44abd24463bd0a18be0908aa64baf1993ec35fa2f5376db6cbff97881f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.062552 kubelet[2884]: E1216 12:15:57.062064 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"106ef44abd24463bd0a18be0908aa64baf1993ec35fa2f5376db6cbff97881f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-v5bnx" Dec 16 12:15:57.062552 kubelet[2884]: E1216 12:15:57.062277 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"106ef44abd24463bd0a18be0908aa64baf1993ec35fa2f5376db6cbff97881f9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-v5bnx" Dec 16 12:15:57.062698 kubelet[2884]: E1216 12:15:57.062656 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"106ef44abd24463bd0a18be0908aa64baf1993ec35fa2f5376db6cbff97881f9\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:15:57.064999 containerd[1680]: time="2025-12-16T12:15:57.064941383Z" level=error msg="Failed to destroy network for sandbox \"e3f64581d5e68bea0d0fd8b761071be07746b084cd02115192534f0c7ffd3d73\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.067825 containerd[1680]: time="2025-12-16T12:15:57.067671637Z" level=error msg="Failed to destroy network for sandbox \"3a4615e6855555192d775a9075a2fd761551b3f25b923d3f510c2808f70a8ba1\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.068237 containerd[1680]: time="2025-12-16T12:15:57.068148159Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-5nkhx,Uid:3cfa8afe-d370-4d42-b9ea-f53cfd764b71,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f64581d5e68bea0d0fd8b761071be07746b084cd02115192534f0c7ffd3d73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.068445 kubelet[2884]: E1216 12:15:57.068414 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f64581d5e68bea0d0fd8b761071be07746b084cd02115192534f0c7ffd3d73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container 
is running and has mounted /var/lib/calico/" Dec 16 12:15:57.068594 kubelet[2884]: E1216 12:15:57.068541 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f64581d5e68bea0d0fd8b761071be07746b084cd02115192534f0c7ffd3d73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" Dec 16 12:15:57.069018 kubelet[2884]: E1216 12:15:57.068663 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"e3f64581d5e68bea0d0fd8b761071be07746b084cd02115192534f0c7ffd3d73\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" Dec 16 12:15:57.069018 kubelet[2884]: E1216 12:15:57.068721 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"e3f64581d5e68bea0d0fd8b761071be07746b084cd02115192534f0c7ffd3d73\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:15:57.072209 containerd[1680]: time="2025-12-16T12:15:57.072159019Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n6pr2,Uid:4f3aa479-c591-4d08-8703-ea7e260802a6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a4615e6855555192d775a9075a2fd761551b3f25b923d3f510c2808f70a8ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.072890 kubelet[2884]: E1216 12:15:57.072505 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a4615e6855555192d775a9075a2fd761551b3f25b923d3f510c2808f70a8ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.072890 kubelet[2884]: E1216 12:15:57.072552 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a4615e6855555192d775a9075a2fd761551b3f25b923d3f510c2808f70a8ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-n6pr2" Dec 16 12:15:57.072890 kubelet[2884]: E1216 12:15:57.072592 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3a4615e6855555192d775a9075a2fd761551b3f25b923d3f510c2808f70a8ba1\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-668d6bf9bc-n6pr2" Dec 16 12:15:57.073017 kubelet[2884]: E1216 12:15:57.072625 2884 pod_workers.go:1301] 
"Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-668d6bf9bc-n6pr2_kube-system(4f3aa479-c591-4d08-8703-ea7e260802a6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-668d6bf9bc-n6pr2_kube-system(4f3aa479-c591-4d08-8703-ea7e260802a6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3a4615e6855555192d775a9075a2fd761551b3f25b923d3f510c2808f70a8ba1\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-668d6bf9bc-n6pr2" podUID="4f3aa479-c591-4d08-8703-ea7e260802a6" Dec 16 12:15:57.076883 containerd[1680]: time="2025-12-16T12:15:57.076845203Z" level=error msg="Failed to destroy network for sandbox \"1a1715fd0017432dabb437fe08558f6c5be8ef79fbaa6d2fa12a60b4b4bdadf9\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.079592 containerd[1680]: time="2025-12-16T12:15:57.079544057Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-695c59c79-wrg8j,Uid:9de1278d-946f-4d27-9201-402be6a50469,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1715fd0017432dabb437fe08558f6c5be8ef79fbaa6d2fa12a60b4b4bdadf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.079927 kubelet[2884]: E1216 12:15:57.079896 2884 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1715fd0017432dabb437fe08558f6c5be8ef79fbaa6d2fa12a60b4b4bdadf9\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:15:57.079975 kubelet[2884]: E1216 12:15:57.079944 2884 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1715fd0017432dabb437fe08558f6c5be8ef79fbaa6d2fa12a60b4b4bdadf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-695c59c79-wrg8j" Dec 16 12:15:57.079975 kubelet[2884]: E1216 12:15:57.079965 2884 kuberuntime_manager.go:1237] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1a1715fd0017432dabb437fe08558f6c5be8ef79fbaa6d2fa12a60b4b4bdadf9\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-695c59c79-wrg8j" Dec 16 12:15:57.080022 kubelet[2884]: E1216 12:15:57.080003 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-695c59c79-wrg8j_calico-system(9de1278d-946f-4d27-9201-402be6a50469)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-695c59c79-wrg8j_calico-system(9de1278d-946f-4d27-9201-402be6a50469)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1a1715fd0017432dabb437fe08558f6c5be8ef79fbaa6d2fa12a60b4b4bdadf9\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-695c59c79-wrg8j" podUID="9de1278d-946f-4d27-9201-402be6a50469" Dec 16 12:15:57.717459 systemd[1]: 
run-netns-cni\x2d784c57cc\x2d0c61\x2d4604\x2d7ba4\x2daa564a8f5525.mount: Deactivated successfully. Dec 16 12:16:03.244201 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1782837624.mount: Deactivated successfully. Dec 16 12:16:03.266478 containerd[1680]: time="2025-12-16T12:16:03.266347689Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:16:03.268121 containerd[1680]: time="2025-12-16T12:16:03.267960297Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Dec 16 12:16:03.270940 containerd[1680]: time="2025-12-16T12:16:03.270888712Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:16:03.273276 containerd[1680]: time="2025-12-16T12:16:03.273223724Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:16:03.274199 containerd[1680]: time="2025-12-16T12:16:03.274164609Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 6.517541879s" Dec 16 12:16:03.275210 containerd[1680]: time="2025-12-16T12:16:03.275125014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 12:16:03.283026 containerd[1680]: time="2025-12-16T12:16:03.282985854Z" level=info msg="CreateContainer within sandbox 
\"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:16:03.300966 containerd[1680]: time="2025-12-16T12:16:03.300893825Z" level=info msg="Container 8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:16:03.312226 containerd[1680]: time="2025-12-16T12:16:03.312162683Z" level=info msg="CreateContainer within sandbox \"b5a44ff26f01aa68aad09ac398ace646732aff07b1d6d3334760efd81ae3f543\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a\"" Dec 16 12:16:03.312849 containerd[1680]: time="2025-12-16T12:16:03.312817366Z" level=info msg="StartContainer for \"8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a\"" Dec 16 12:16:03.314480 containerd[1680]: time="2025-12-16T12:16:03.314451294Z" level=info msg="connecting to shim 8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a" address="unix:///run/containerd/s/46b4aab1dcd2e3d75ce022e3cf996172cada088dd310f91f38aa62000b2e8b87" protocol=ttrpc version=3 Dec 16 12:16:03.348199 systemd[1]: Started cri-containerd-8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a.scope - libcontainer container 8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a. 
Dec 16 12:16:03.421000 audit: BPF prog-id=172 op=LOAD Dec 16 12:16:03.423552 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:16:03.423636 kernel: audit: type=1334 audit(1765887363.421:575): prog-id=172 op=LOAD Dec 16 12:16:03.423660 kernel: audit: type=1300 audit(1765887363.421:575): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.421000 audit[4018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.429270 kernel: audit: type=1327 audit(1765887363.421:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.421000 audit: BPF prog-id=173 op=LOAD Dec 16 12:16:03.430297 kernel: audit: type=1334 audit(1765887363.421:576): prog-id=173 op=LOAD Dec 16 12:16:03.421000 audit[4018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.433344 kernel: audit: type=1300 audit(1765887363.421:576): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.436167 kernel: audit: type=1327 audit(1765887363.421:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.421000 audit: BPF prog-id=173 op=UNLOAD Dec 16 12:16:03.421000 audit[4018]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.439843 kernel: audit: type=1334 audit(1765887363.421:577): prog-id=173 op=UNLOAD Dec 16 12:16:03.439900 kernel: audit: type=1300 audit(1765887363.421:577): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.439921 kernel: audit: type=1327 audit(1765887363.421:577): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.421000 audit: BPF prog-id=172 op=UNLOAD Dec 16 12:16:03.443603 kernel: audit: type=1334 audit(1765887363.421:578): prog-id=172 op=UNLOAD Dec 16 12:16:03.421000 audit[4018]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.421000 audit: BPF prog-id=174 op=LOAD Dec 16 12:16:03.421000 audit[4018]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3403 pid=4018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:03.421000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3865313065323964653631333135353365346431616438383335366337 Dec 16 12:16:03.455424 containerd[1680]: time="2025-12-16T12:16:03.455391093Z" level=info msg="StartContainer for \"8e10e29de6131553e4d1ad88356c745f080cfc9278693f4c891962a0fb92256a\" returns successfully" Dec 16 12:16:03.592026 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:16:03.592139 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:16:03.797612 kubelet[2884]: I1216 12:16:03.797519 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-t8f6s" podStartSLOduration=1.611913555 podStartE2EDuration="22.797505398s" podCreationTimestamp="2025-12-16 12:15:41 +0000 UTC" firstStartedPulling="2025-12-16 12:15:42.09127122 +0000 UTC m=+21.710264680" lastFinishedPulling="2025-12-16 12:16:03.276863063 +0000 UTC m=+42.895856523" observedRunningTime="2025-12-16 12:16:03.796591313 +0000 UTC m=+43.415584813" watchObservedRunningTime="2025-12-16 12:16:03.797505398 +0000 UTC m=+43.416498858" Dec 16 12:16:03.818252 kubelet[2884]: I1216 12:16:03.818204 2884 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de1278d-946f-4d27-9201-402be6a50469-whisker-ca-bundle\") pod \"9de1278d-946f-4d27-9201-402be6a50469\" (UID: \"9de1278d-946f-4d27-9201-402be6a50469\") " Dec 16 12:16:03.818252 kubelet[2884]: I1216 12:16:03.818258 2884 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9khtg\" (UniqueName: \"kubernetes.io/projected/9de1278d-946f-4d27-9201-402be6a50469-kube-api-access-9khtg\") pod \"9de1278d-946f-4d27-9201-402be6a50469\" (UID: 
\"9de1278d-946f-4d27-9201-402be6a50469\") " Dec 16 12:16:03.818436 kubelet[2884]: I1216 12:16:03.818294 2884 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9de1278d-946f-4d27-9201-402be6a50469-whisker-backend-key-pair\") pod \"9de1278d-946f-4d27-9201-402be6a50469\" (UID: \"9de1278d-946f-4d27-9201-402be6a50469\") " Dec 16 12:16:03.819261 kubelet[2884]: I1216 12:16:03.819188 2884 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9de1278d-946f-4d27-9201-402be6a50469-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "9de1278d-946f-4d27-9201-402be6a50469" (UID: "9de1278d-946f-4d27-9201-402be6a50469"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:16:03.821428 kubelet[2884]: I1216 12:16:03.821376 2884 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9de1278d-946f-4d27-9201-402be6a50469-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "9de1278d-946f-4d27-9201-402be6a50469" (UID: "9de1278d-946f-4d27-9201-402be6a50469"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:16:03.822347 kubelet[2884]: I1216 12:16:03.822311 2884 operation_generator.go:780] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9de1278d-946f-4d27-9201-402be6a50469-kube-api-access-9khtg" (OuterVolumeSpecName: "kube-api-access-9khtg") pod "9de1278d-946f-4d27-9201-402be6a50469" (UID: "9de1278d-946f-4d27-9201-402be6a50469"). InnerVolumeSpecName "kube-api-access-9khtg". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:16:03.918886 kubelet[2884]: I1216 12:16:03.918791 2884 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9de1278d-946f-4d27-9201-402be6a50469-whisker-ca-bundle\") on node \"ci-4547-0-0-0-5b424f63c8\" DevicePath \"\"" Dec 16 12:16:03.918886 kubelet[2884]: I1216 12:16:03.918866 2884 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9khtg\" (UniqueName: \"kubernetes.io/projected/9de1278d-946f-4d27-9201-402be6a50469-kube-api-access-9khtg\") on node \"ci-4547-0-0-0-5b424f63c8\" DevicePath \"\"" Dec 16 12:16:03.918886 kubelet[2884]: I1216 12:16:03.918897 2884 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9de1278d-946f-4d27-9201-402be6a50469-whisker-backend-key-pair\") on node \"ci-4547-0-0-0-5b424f63c8\" DevicePath \"\"" Dec 16 12:16:04.080236 systemd[1]: Removed slice kubepods-besteffort-pod9de1278d_946f_4d27_9201_402be6a50469.slice - libcontainer container kubepods-besteffort-pod9de1278d_946f_4d27_9201_402be6a50469.slice. Dec 16 12:16:04.133619 systemd[1]: Created slice kubepods-besteffort-pod9883af1d_f9ff_4212_ac38_34ecc575631c.slice - libcontainer container kubepods-besteffort-pod9883af1d_f9ff_4212_ac38_34ecc575631c.slice. 
Dec 16 12:16:04.221009 kubelet[2884]: I1216 12:16:04.220873 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2s2rj\" (UniqueName: \"kubernetes.io/projected/9883af1d-f9ff-4212-ac38-34ecc575631c-kube-api-access-2s2rj\") pod \"whisker-69fc96cf55-8lskp\" (UID: \"9883af1d-f9ff-4212-ac38-34ecc575631c\") " pod="calico-system/whisker-69fc96cf55-8lskp" Dec 16 12:16:04.221170 kubelet[2884]: I1216 12:16:04.221023 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9883af1d-f9ff-4212-ac38-34ecc575631c-whisker-backend-key-pair\") pod \"whisker-69fc96cf55-8lskp\" (UID: \"9883af1d-f9ff-4212-ac38-34ecc575631c\") " pod="calico-system/whisker-69fc96cf55-8lskp" Dec 16 12:16:04.221170 kubelet[2884]: I1216 12:16:04.221068 2884 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9883af1d-f9ff-4212-ac38-34ecc575631c-whisker-ca-bundle\") pod \"whisker-69fc96cf55-8lskp\" (UID: \"9883af1d-f9ff-4212-ac38-34ecc575631c\") " pod="calico-system/whisker-69fc96cf55-8lskp" Dec 16 12:16:04.245162 systemd[1]: var-lib-kubelet-pods-9de1278d\x2d946f\x2d4d27\x2d9201\x2d402be6a50469-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d9khtg.mount: Deactivated successfully. Dec 16 12:16:04.245246 systemd[1]: var-lib-kubelet-pods-9de1278d\x2d946f\x2d4d27\x2d9201\x2d402be6a50469-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 16 12:16:04.438017 containerd[1680]: time="2025-12-16T12:16:04.437984224Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69fc96cf55-8lskp,Uid:9883af1d-f9ff-4212-ac38-34ecc575631c,Namespace:calico-system,Attempt:0,}" Dec 16 12:16:04.529406 kubelet[2884]: I1216 12:16:04.529353 2884 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:16:04.560000 audit[4131]: NETFILTER_CFG table=filter:119 family=2 entries=21 op=nft_register_rule pid=4131 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:04.560000 audit[4131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc8aac600 a2=0 a3=1 items=0 ppid=2997 pid=4131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.560000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:04.566000 audit[4131]: NETFILTER_CFG table=nat:120 family=2 entries=19 op=nft_register_chain pid=4131 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:04.566000 audit[4131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc8aac600 a2=0 a3=1 items=0 ppid=2997 pid=4131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.566000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:04.586876 systemd-networkd[1582]: cali2e8fa9d3758: Link UP Dec 16 12:16:04.587041 systemd-networkd[1582]: cali2e8fa9d3758: Gained carrier Dec 16 12:16:04.600222 containerd[1680]: 2025-12-16 12:16:04.465 [INFO][4108] cni-plugin/utils.go 100: File 
/var/lib/calico/mtu does not exist Dec 16 12:16:04.600222 containerd[1680]: 2025-12-16 12:16:04.483 [INFO][4108] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0 whisker-69fc96cf55- calico-system 9883af1d-f9ff-4212-ac38-34ecc575631c 932 0 2025-12-16 12:16:04 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:69fc96cf55 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 whisker-69fc96cf55-8lskp eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali2e8fa9d3758 [] [] }} ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-" Dec 16 12:16:04.600222 containerd[1680]: 2025-12-16 12:16:04.483 [INFO][4108] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.600222 containerd[1680]: 2025-12-16 12:16:04.529 [INFO][4122] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" HandleID="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Workload="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.529 [INFO][4122] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" 
HandleID="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Workload="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000323ef0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"whisker-69fc96cf55-8lskp", "timestamp":"2025-12-16 12:16:04.529433931 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.529 [INFO][4122] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.529 [INFO][4122] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.529 [INFO][4122] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.541 [INFO][4122] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.549 [INFO][4122] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.558 [INFO][4122] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.562 [INFO][4122] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600429 containerd[1680]: 2025-12-16 12:16:04.564 [INFO][4122] ipam/ipam.go 235: Affinity is confirmed and block 
has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.564 [INFO][4122] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.568 [INFO][4122] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.572 [INFO][4122] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.578 [INFO][4122] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.67.65/26] block=192.168.67.64/26 handle="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.578 [INFO][4122] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.65/26] handle="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.578 [INFO][4122] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:16:04.600619 containerd[1680]: 2025-12-16 12:16:04.578 [INFO][4122] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.65/26] IPv6=[] ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" HandleID="k8s-pod-network.0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Workload="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.600754 containerd[1680]: 2025-12-16 12:16:04.581 [INFO][4108] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0", GenerateName:"whisker-69fc96cf55-", Namespace:"calico-system", SelfLink:"", UID:"9883af1d-f9ff-4212-ac38-34ecc575631c", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 16, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69fc96cf55", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"whisker-69fc96cf55-8lskp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.67.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", 
"ksa.calico-system.whisker"}, InterfaceName:"cali2e8fa9d3758", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:04.600754 containerd[1680]: 2025-12-16 12:16:04.581 [INFO][4108] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.65/32] ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.600824 containerd[1680]: 2025-12-16 12:16:04.581 [INFO][4108] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2e8fa9d3758 ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.600824 containerd[1680]: 2025-12-16 12:16:04.588 [INFO][4108] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.600861 containerd[1680]: 2025-12-16 12:16:04.588 [INFO][4108] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0", GenerateName:"whisker-69fc96cf55-", Namespace:"calico-system", SelfLink:"", 
UID:"9883af1d-f9ff-4212-ac38-34ecc575631c", ResourceVersion:"932", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 16, 4, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"69fc96cf55", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b", Pod:"whisker-69fc96cf55-8lskp", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.67.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali2e8fa9d3758", MAC:"ea:c7:b0:0f:f6:9c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:04.600915 containerd[1680]: 2025-12-16 12:16:04.597 [INFO][4108] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" Namespace="calico-system" Pod="whisker-69fc96cf55-8lskp" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-whisker--69fc96cf55--8lskp-eth0" Dec 16 12:16:04.626609 containerd[1680]: time="2025-12-16T12:16:04.626558266Z" level=info msg="connecting to shim 0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b" address="unix:///run/containerd/s/676de1eea23faca8e5e0b0ce3bd5a348d77b795055c8fa6404be9066ea2c6a55" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:04.651354 kubelet[2884]: I1216 12:16:04.651321 2884 kubelet_volumes.go:163] "Cleaned up 
orphaned pod volumes dir" podUID="9de1278d-946f-4d27-9201-402be6a50469" path="/var/lib/kubelet/pods/9de1278d-946f-4d27-9201-402be6a50469/volumes" Dec 16 12:16:04.654514 systemd[1]: Started cri-containerd-0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b.scope - libcontainer container 0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b. Dec 16 12:16:04.663000 audit: BPF prog-id=175 op=LOAD Dec 16 12:16:04.664000 audit: BPF prog-id=176 op=LOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4149 pid=4160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.664000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4149 pid=4160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.664000 audit: BPF prog-id=177 op=LOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4149 pid=4160 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.664000 audit: BPF prog-id=178 op=LOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4149 pid=4160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.664000 audit: BPF prog-id=178 op=UNLOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4149 pid=4160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.664000 audit: BPF prog-id=177 op=UNLOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 
ppid=4149 pid=4160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.664000 audit: BPF prog-id=179 op=LOAD Dec 16 12:16:04.664000 audit[4160]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4149 pid=4160 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:04.664000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3038353165333634613563653133326639616264623030653335356562 Dec 16 12:16:04.690832 containerd[1680]: time="2025-12-16T12:16:04.690720953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-69fc96cf55-8lskp,Uid:9883af1d-f9ff-4212-ac38-34ecc575631c,Namespace:calico-system,Attempt:0,} returns sandbox id \"0851e364a5ce132f9abdb00e355eb684ba86ec956229a823f939c53f465ecd9b\"" Dec 16 12:16:04.692888 containerd[1680]: time="2025-12-16T12:16:04.692861684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:16:05.042041 containerd[1680]: time="2025-12-16T12:16:05.041799904Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:05.043988 containerd[1680]: time="2025-12-16T12:16:05.043948275Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:16:05.044153 containerd[1680]: time="2025-12-16T12:16:05.044021475Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:05.044525 kubelet[2884]: E1216 12:16:05.044471 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:16:05.044780 kubelet[2884]: E1216 12:16:05.044534 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:16:05.044818 kubelet[2884]: E1216 12:16:05.044739 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bce81bb11dd34fd8b7a0b5197b60303d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:05.047496 containerd[1680]: time="2025-12-16T12:16:05.047464132Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:16:05.048000 audit: BPF prog-id=180 op=LOAD Dec 16 
12:16:05.048000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffffe9a18a8 a2=98 a3=fffffe9a1898 items=0 ppid=4238 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.048000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:16:05.048000 audit: BPF prog-id=180 op=UNLOAD Dec 16 12:16:05.048000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffffe9a1878 a3=0 items=0 ppid=4238 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.048000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:16:05.048000 audit: BPF prog-id=181 op=LOAD Dec 16 12:16:05.048000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffffe9a1758 a2=74 a3=95 items=0 ppid=4238 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.048000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:16:05.048000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:16:05.048000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4238 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.048000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:16:05.048000 audit: BPF prog-id=182 op=LOAD Dec 16 12:16:05.048000 audit[4330]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffffe9a1788 a2=40 a3=fffffe9a17b8 items=0 ppid=4238 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.048000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:16:05.048000 audit: BPF prog-id=182 op=UNLOAD Dec 16 12:16:05.048000 audit[4330]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=fffffe9a17b8 items=0 ppid=4238 pid=4330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.048000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:16:05.050000 audit: BPF prog-id=183 op=LOAD Dec 16 12:16:05.050000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffffc0dd6f8 a2=98 a3=fffffc0dd6e8 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.050000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.051000 audit: BPF prog-id=183 op=UNLOAD Dec 16 12:16:05.051000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffffc0dd6c8 a3=0 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.051000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.051000 audit: BPF prog-id=184 op=LOAD Dec 16 12:16:05.051000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffc0dd388 a2=74 a3=95 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.051000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.051000 audit: BPF prog-id=184 op=UNLOAD Dec 16 12:16:05.051000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4238 
pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.051000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.051000 audit: BPF prog-id=185 op=LOAD Dec 16 12:16:05.051000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffc0dd3e8 a2=94 a3=2 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.051000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.051000 audit: BPF prog-id=185 op=UNLOAD Dec 16 12:16:05.051000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.051000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.158000 audit: BPF prog-id=186 op=LOAD Dec 16 12:16:05.158000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffffc0dd3a8 a2=40 a3=fffffc0dd3d8 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.158000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.158000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:16:05.158000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffffc0dd3d8 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.158000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.168000 audit: BPF prog-id=187 op=LOAD Dec 16 12:16:05.168000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffc0dd3b8 a2=94 a3=4 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.168000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.169000 audit: BPF prog-id=187 op=UNLOAD Dec 16 12:16:05.169000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.169000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.169000 audit: BPF prog-id=188 op=LOAD Dec 16 12:16:05.169000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffffc0dd1f8 a2=94 a3=5 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.169000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.169000 audit: BPF prog-id=188 op=UNLOAD Dec 16 12:16:05.169000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.169000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.169000 audit: BPF prog-id=189 op=LOAD Dec 16 12:16:05.169000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffc0dd428 a2=94 a3=6 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.169000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.169000 audit: BPF prog-id=189 op=UNLOAD Dec 16 12:16:05.169000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.169000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.170000 audit: BPF prog-id=190 op=LOAD Dec 16 12:16:05.170000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffffc0dcbf8 a2=94 a3=83 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.170000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.170000 audit: BPF prog-id=191 op=LOAD Dec 16 12:16:05.170000 audit[4331]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=fffffc0dc9b8 a2=94 a3=2 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.170000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.171000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:16:05.171000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.171000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.171000 audit: BPF prog-id=190 op=UNLOAD Dec 16 12:16:05.171000 audit[4331]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1f0a3620 a3=1f096b00 items=0 ppid=4238 pid=4331 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.171000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:16:05.181000 audit: BPF prog-id=192 op=LOAD Dec 16 12:16:05.181000 audit[4354]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff9996a18 a2=98 a3=fffff9996a08 items=0 ppid=4238 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.181000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:16:05.181000 audit: BPF prog-id=192 op=UNLOAD Dec 16 12:16:05.181000 audit[4354]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff99969e8 a3=0 items=0 ppid=4238 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.181000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:16:05.181000 audit: BPF prog-id=193 op=LOAD Dec 16 12:16:05.181000 audit[4354]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff99968c8 a2=74 a3=95 items=0 ppid=4238 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.181000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:16:05.181000 audit: BPF prog-id=193 op=UNLOAD Dec 16 12:16:05.181000 audit[4354]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4238 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.181000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:16:05.181000 audit: BPF prog-id=194 op=LOAD Dec 16 12:16:05.181000 audit[4354]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff99968f8 a2=40 a3=fffff9996928 
items=0 ppid=4238 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.181000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:16:05.181000 audit: BPF prog-id=194 op=UNLOAD Dec 16 12:16:05.181000 audit[4354]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=fffff9996928 items=0 ppid=4238 pid=4354 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.181000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:16:05.240757 systemd-networkd[1582]: vxlan.calico: Link UP Dec 16 12:16:05.240771 systemd-networkd[1582]: vxlan.calico: Gained carrier Dec 16 12:16:05.258000 audit: BPF prog-id=195 op=LOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe4dae798 a2=98 a3=ffffe4dae788 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=195 op=UNLOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe4dae768 a3=0 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=196 op=LOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe4dae478 a2=74 a3=95 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=197 op=LOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe4dae4d8 a2=94 a3=2 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=197 op=UNLOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=198 op=LOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe4dae358 a2=40 a3=ffffe4dae388 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=198 op=UNLOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffe4dae388 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.258000 audit: BPF prog-id=199 op=LOAD Dec 16 12:16:05.258000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe4dae4a8 a2=94 a3=b7 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.258000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.259000 audit: BPF prog-id=199 op=UNLOAD Dec 16 12:16:05.259000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.259000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.260000 audit: BPF prog-id=200 op=LOAD Dec 16 12:16:05.260000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe4dadb58 a2=94 a3=2 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.260000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.260000 audit: BPF prog-id=200 op=UNLOAD Dec 16 12:16:05.260000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.260000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.260000 audit: BPF prog-id=201 op=LOAD Dec 16 12:16:05.260000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffe4dadce8 a2=94 a3=30 items=0 ppid=4238 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.260000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:16:05.266000 audit: BPF prog-id=202 op=LOAD Dec 16 12:16:05.266000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc4354da8 a2=98 a3=ffffc4354d98 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.266000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.266000 audit: BPF prog-id=202 op=UNLOAD Dec 16 12:16:05.266000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc4354d78 a3=0 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.266000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.266000 audit: BPF prog-id=203 op=LOAD Dec 16 12:16:05.266000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc4354a38 a2=74 a3=95 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.266000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.266000 audit: BPF prog-id=203 op=UNLOAD Dec 16 12:16:05.266000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.266000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.266000 audit: BPF prog-id=204 op=LOAD Dec 16 12:16:05.266000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc4354a98 a2=94 a3=2 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.266000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.266000 audit: BPF prog-id=204 op=UNLOAD Dec 16 12:16:05.266000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.266000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.368000 audit: BPF prog-id=205 op=LOAD Dec 16 12:16:05.368000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc4354a58 a2=40 a3=ffffc4354a88 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.368000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.369000 audit: BPF prog-id=205 op=UNLOAD Dec 16 12:16:05.369000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffc4354a88 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.369000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.378000 audit: BPF prog-id=206 op=LOAD Dec 16 12:16:05.378000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc4354a68 a2=94 a3=4 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.378000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.378000 audit: BPF prog-id=206 op=UNLOAD Dec 16 12:16:05.378000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.378000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=207 op=LOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc43548a8 a2=94 a3=5 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=207 op=UNLOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=208 op=LOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc4354ad8 a2=94 a3=6 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=208 op=UNLOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=209 op=LOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc43542a8 a2=94 a3=83 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=210 op=LOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffc4354068 a2=94 a3=2 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.379000 audit: BPF prog-id=210 op=UNLOAD Dec 16 12:16:05.379000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.379000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.380000 audit: BPF prog-id=209 op=UNLOAD Dec 16 12:16:05.380000 audit[4388]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=e84620 a3=e77b00 items=0 ppid=4238 pid=4388 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.380000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:16:05.383744 containerd[1680]: time="2025-12-16T12:16:05.383694887Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:05.385582 containerd[1680]: time="2025-12-16T12:16:05.385536777Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:16:05.385985 containerd[1680]: time="2025-12-16T12:16:05.385628017Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:05.386392 kubelet[2884]: E1216 12:16:05.385784 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:16:05.386392 kubelet[2884]: E1216 12:16:05.385825 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:16:05.386392 kubelet[2884]: E1216 12:16:05.385930 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:05.387139 kubelet[2884]: E1216 12:16:05.387105 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:16:05.388000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:16:05.388000 audit[4238]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=40007decc0 a2=0 a3=0 items=0 ppid=4222 pid=4238 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.388000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:16:05.433000 audit[4412]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4412 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:05.433000 audit[4412]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffc2ab02d0 a2=0 a3=ffff9a942fa8 items=0 ppid=4238 pid=4412 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.433000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:05.434000 audit[4414]: NETFILTER_CFG table=mangle:122 family=2 entries=16 op=nft_register_chain pid=4414 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:05.434000 audit[4414]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffe6d45750 a2=0 a3=ffff7f4cdfa8 items=0 ppid=4238 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.434000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:05.441000 audit[4411]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4411 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:05.441000 audit[4411]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=fffffefabdd0 a2=0 a3=ffffa1268fa8 items=0 ppid=4238 pid=4411 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.441000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:05.442000 audit[4413]: NETFILTER_CFG table=filter:124 family=2 entries=94 op=nft_register_chain pid=4413 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:05.442000 audit[4413]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffef4cfc70 a2=0 a3=ffffa35c0fa8 items=0 ppid=4238 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.442000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:05.642455 systemd-networkd[1582]: cali2e8fa9d3758: Gained IPv6LL Dec 16 12:16:05.781470 kubelet[2884]: E1216 12:16:05.781218 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:16:05.803000 audit[4424]: NETFILTER_CFG table=filter:125 family=2 entries=20 op=nft_register_rule pid=4424 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:05.803000 audit[4424]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd1947050 a2=0 a3=1 items=0 ppid=2997 pid=4424 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.803000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:05.808000 audit[4424]: NETFILTER_CFG table=nat:126 family=2 entries=14 op=nft_register_rule pid=4424 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:05.808000 audit[4424]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd1947050 a2=0 a3=1 items=0 ppid=2997 pid=4424 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:05.808000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:06.730409 systemd-networkd[1582]: vxlan.calico: Gained IPv6LL Dec 16 12:16:07.648866 containerd[1680]: time="2025-12-16T12:16:07.648805679Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n6pr2,Uid:4f3aa479-c591-4d08-8703-ea7e260802a6,Namespace:kube-system,Attempt:0,}" Dec 16 12:16:07.759065 systemd-networkd[1582]: cali796a623e354: Link UP Dec 16 12:16:07.759270 systemd-networkd[1582]: cali796a623e354: Gained carrier Dec 16 12:16:07.775150 containerd[1680]: 2025-12-16 12:16:07.689 [INFO][4429] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0 coredns-668d6bf9bc- kube-system 4f3aa479-c591-4d08-8703-ea7e260802a6 849 0 2025-12-16 12:15:26 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 coredns-668d6bf9bc-n6pr2 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali796a623e354 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} 
ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-" Dec 16 12:16:07.775150 containerd[1680]: 2025-12-16 12:16:07.690 [INFO][4429] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.775150 containerd[1680]: 2025-12-16 12:16:07.714 [INFO][4443] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" HandleID="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Workload="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.714 [INFO][4443] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" HandleID="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Workload="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000511da0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"coredns-668d6bf9bc-n6pr2", "timestamp":"2025-12-16 12:16:07.714501894 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.714 [INFO][4443] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.714 [INFO][4443] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.714 [INFO][4443] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.724 [INFO][4443] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.728 [INFO][4443] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.734 [INFO][4443] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.736 [INFO][4443] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.775712 containerd[1680]: 2025-12-16 12:16:07.738 [INFO][4443] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.738 [INFO][4443] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.740 [INFO][4443] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.747 [INFO][4443] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" 
host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.753 [INFO][4443] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.67.66/26] block=192.168.67.64/26 handle="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.753 [INFO][4443] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.66/26] handle="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.753 [INFO][4443] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:07.776143 containerd[1680]: 2025-12-16 12:16:07.753 [INFO][4443] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.66/26] IPv6=[] ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" HandleID="k8s-pod-network.a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Workload="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.776553 containerd[1680]: 2025-12-16 12:16:07.755 [INFO][4429] cni-plugin/k8s.go 418: Populated endpoint ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4f3aa479-c591-4d08-8703-ea7e260802a6", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"coredns-668d6bf9bc-n6pr2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.67.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali796a623e354", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:07.776553 containerd[1680]: 2025-12-16 12:16:07.755 [INFO][4429] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.66/32] ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.776553 containerd[1680]: 2025-12-16 12:16:07.755 [INFO][4429] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali796a623e354 ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" 
WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.776553 containerd[1680]: 2025-12-16 12:16:07.757 [INFO][4429] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.776553 containerd[1680]: 2025-12-16 12:16:07.758 [INFO][4429] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"4f3aa479-c591-4d08-8703-ea7e260802a6", ResourceVersion:"849", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e", Pod:"coredns-668d6bf9bc-n6pr2", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.67.66/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali796a623e354", MAC:"26:a2:57:ef:0b:27", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:07.776553 containerd[1680]: 2025-12-16 12:16:07.771 [INFO][4429] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" Namespace="kube-system" Pod="coredns-668d6bf9bc-n6pr2" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--n6pr2-eth0" Dec 16 12:16:07.786000 audit[4461]: NETFILTER_CFG table=filter:127 family=2 entries=42 op=nft_register_chain pid=4461 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:07.786000 audit[4461]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22552 a0=3 a1=ffffd5afc7d0 a2=0 a3=ffffa7f6afa8 items=0 ppid=4238 pid=4461 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.786000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:07.804944 containerd[1680]: time="2025-12-16T12:16:07.804887435Z" level=info msg="connecting to shim a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e" 
address="unix:///run/containerd/s/9f272e5d2e6969b0a54dbdd45b11382edbcd32c3c5793fd9edcce664bd8faaa6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:07.833289 systemd[1]: Started cri-containerd-a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e.scope - libcontainer container a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e. Dec 16 12:16:07.842000 audit: BPF prog-id=211 op=LOAD Dec 16 12:16:07.842000 audit: BPF prog-id=212 op=LOAD Dec 16 12:16:07.842000 audit[4483]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4470 pid=4483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.842000 audit: BPF prog-id=212 op=UNLOAD Dec 16 12:16:07.842000 audit[4483]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4470 pid=4483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.842000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.843000 audit: BPF prog-id=213 op=LOAD Dec 16 12:16:07.843000 audit[4483]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4470 pid=4483 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.843000 audit: BPF prog-id=214 op=LOAD Dec 16 12:16:07.843000 audit[4483]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4470 pid=4483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.843000 audit: BPF prog-id=214 op=UNLOAD Dec 16 12:16:07.843000 audit[4483]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4470 pid=4483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.843000 audit: BPF prog-id=213 op=UNLOAD Dec 16 12:16:07.843000 audit[4483]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 
ppid=4470 pid=4483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.843000 audit: BPF prog-id=215 op=LOAD Dec 16 12:16:07.843000 audit[4483]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4470 pid=4483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.843000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6136626564303938636366663063653563386164646431393538643832 Dec 16 12:16:07.866267 containerd[1680]: time="2025-12-16T12:16:07.866233268Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-n6pr2,Uid:4f3aa479-c591-4d08-8703-ea7e260802a6,Namespace:kube-system,Attempt:0,} returns sandbox id \"a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e\"" Dec 16 12:16:07.868492 containerd[1680]: time="2025-12-16T12:16:07.868459799Z" level=info msg="CreateContainer within sandbox \"a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:16:07.877331 containerd[1680]: time="2025-12-16T12:16:07.876571201Z" level=info msg="Container c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7: CDI devices from CRI Config.CDIDevices: []" Dec 16 
12:16:07.888062 containerd[1680]: time="2025-12-16T12:16:07.888021099Z" level=info msg="CreateContainer within sandbox \"a6bed098ccff0ce5c8addd1958d825b0676a222e583f34abee366fd2f50c505e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7\"" Dec 16 12:16:07.888545 containerd[1680]: time="2025-12-16T12:16:07.888516541Z" level=info msg="StartContainer for \"c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7\"" Dec 16 12:16:07.890885 containerd[1680]: time="2025-12-16T12:16:07.890855193Z" level=info msg="connecting to shim c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7" address="unix:///run/containerd/s/9f272e5d2e6969b0a54dbdd45b11382edbcd32c3c5793fd9edcce664bd8faaa6" protocol=ttrpc version=3 Dec 16 12:16:07.911280 systemd[1]: Started cri-containerd-c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7.scope - libcontainer container c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7. 
Dec 16 12:16:07.920000 audit: BPF prog-id=216 op=LOAD Dec 16 12:16:07.921000 audit: BPF prog-id=217 op=LOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.921000 audit: BPF prog-id=217 op=UNLOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.921000 audit: BPF prog-id=218 op=LOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.921000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.921000 audit: BPF prog-id=219 op=LOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.921000 audit: BPF prog-id=219 op=UNLOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.921000 audit: BPF prog-id=218 op=UNLOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:16:07.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.921000 audit: BPF prog-id=220 op=LOAD Dec 16 12:16:07.921000 audit[4509]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4470 pid=4509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:07.921000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303861353464636462323830633039333764323363373033663366 Dec 16 12:16:07.940201 containerd[1680]: time="2025-12-16T12:16:07.940107845Z" level=info msg="StartContainer for \"c108a54dcdb280c0937d23c703f3f927ff26578d4a4ad4ab857851700e075de7\" returns successfully" Dec 16 12:16:08.649593 containerd[1680]: time="2025-12-16T12:16:08.649554543Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-sbbxz,Uid:13366e02-1117-46c3-a880-8d8cc6c423f8,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:16:08.758064 systemd-networkd[1582]: cali48aa71c4b1e: Link UP Dec 16 12:16:08.759060 systemd-networkd[1582]: cali48aa71c4b1e: Gained carrier Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.691 [INFO][4545] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0 calico-apiserver-6784c79f67- calico-apiserver 13366e02-1117-46c3-a880-8d8cc6c423f8 847 0 2025-12-16 12:15:36 +0000 UTC map[apiserver:true 
app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6784c79f67 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 calico-apiserver-6784c79f67-sbbxz eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali48aa71c4b1e [] [] }} ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.691 [INFO][4545] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.715 [INFO][4560] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" HandleID="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.716 [INFO][4560] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" HandleID="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001374e0), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-0-5b424f63c8", 
"pod":"calico-apiserver-6784c79f67-sbbxz", "timestamp":"2025-12-16 12:16:08.715948321 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.716 [INFO][4560] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.716 [INFO][4560] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.716 [INFO][4560] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.726 [INFO][4560] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.730 [INFO][4560] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.735 [INFO][4560] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.737 [INFO][4560] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.740 [INFO][4560] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.740 [INFO][4560] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" 
host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.741 [INFO][4560] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4 Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.746 [INFO][4560] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.752 [INFO][4560] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.67.67/26] block=192.168.67.64/26 handle="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.752 [INFO][4560] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.67/26] handle="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.752 [INFO][4560] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:16:08.771818 containerd[1680]: 2025-12-16 12:16:08.752 [INFO][4560] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.67/26] IPv6=[] ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" HandleID="k8s-pod-network.1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.773663 containerd[1680]: 2025-12-16 12:16:08.754 [INFO][4545] cni-plugin/k8s.go 418: Populated endpoint ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0", GenerateName:"calico-apiserver-6784c79f67-", Namespace:"calico-apiserver", SelfLink:"", UID:"13366e02-1117-46c3-a880-8d8cc6c423f8", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6784c79f67", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"calico-apiserver-6784c79f67-sbbxz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", 
IPNetworks:[]string{"192.168.67.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali48aa71c4b1e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:08.773663 containerd[1680]: 2025-12-16 12:16:08.755 [INFO][4545] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.67/32] ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.773663 containerd[1680]: 2025-12-16 12:16:08.755 [INFO][4545] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali48aa71c4b1e ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.773663 containerd[1680]: 2025-12-16 12:16:08.759 [INFO][4545] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.773663 containerd[1680]: 2025-12-16 12:16:08.759 [INFO][4545] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", 
APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0", GenerateName:"calico-apiserver-6784c79f67-", Namespace:"calico-apiserver", SelfLink:"", UID:"13366e02-1117-46c3-a880-8d8cc6c423f8", ResourceVersion:"847", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6784c79f67", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4", Pod:"calico-apiserver-6784c79f67-sbbxz", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.67.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali48aa71c4b1e", MAC:"0a:a3:68:f8:4b:dd", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:08.773663 containerd[1680]: 2025-12-16 12:16:08.769 [INFO][4545] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-sbbxz" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--sbbxz-eth0" Dec 16 12:16:08.785000 audit[4576]: NETFILTER_CFG table=filter:128 family=2 
entries=54 op=nft_register_chain pid=4576 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:08.787409 kernel: kauditd_printk_skb: 284 callbacks suppressed Dec 16 12:16:08.787704 kernel: audit: type=1325 audit(1765887368.785:675): table=filter:128 family=2 entries=54 op=nft_register_chain pid=4576 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:08.785000 audit[4576]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=29396 a0=3 a1=ffffe2d098d0 a2=0 a3=ffffa9ed2fa8 items=0 ppid=4238 pid=4576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.792529 kernel: audit: type=1300 audit(1765887368.785:675): arch=c00000b7 syscall=211 success=yes exit=29396 a0=3 a1=ffffe2d098d0 a2=0 a3=ffffa9ed2fa8 items=0 ppid=4238 pid=4576 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.785000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:08.794793 kernel: audit: type=1327 audit(1765887368.785:675): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:08.803770 containerd[1680]: time="2025-12-16T12:16:08.803730809Z" level=info msg="connecting to shim 1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4" address="unix:///run/containerd/s/43069fcddb52a72d5bb51b5de2c3952d765c5731e03ceb4c83737312e44129b7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:08.808713 kubelet[2884]: I1216 12:16:08.808613 2884 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="kube-system/coredns-668d6bf9bc-n6pr2" podStartSLOduration=42.808592834 podStartE2EDuration="42.808592834s" podCreationTimestamp="2025-12-16 12:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:16:08.808151711 +0000 UTC m=+48.427145171" watchObservedRunningTime="2025-12-16 12:16:08.808592834 +0000 UTC m=+48.427586294" Dec 16 12:16:08.823000 audit[4603]: NETFILTER_CFG table=filter:129 family=2 entries=20 op=nft_register_rule pid=4603 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:08.841316 systemd[1]: Started cri-containerd-1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4.scope - libcontainer container 1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4. Dec 16 12:16:08.823000 audit[4603]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffeb419e10 a2=0 a3=1 items=0 ppid=2997 pid=4603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.846539 kernel: audit: type=1325 audit(1765887368.823:676): table=filter:129 family=2 entries=20 op=nft_register_rule pid=4603 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:08.846643 kernel: audit: type=1300 audit(1765887368.823:676): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffeb419e10 a2=0 a3=1 items=0 ppid=2997 pid=4603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.823000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:08.849580 kernel: audit: type=1327 
audit(1765887368.823:676): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:08.849000 audit[4603]: NETFILTER_CFG table=nat:130 family=2 entries=14 op=nft_register_rule pid=4603 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:08.849000 audit[4603]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffeb419e10 a2=0 a3=1 items=0 ppid=2997 pid=4603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.857160 kernel: audit: type=1325 audit(1765887368.849:677): table=nat:130 family=2 entries=14 op=nft_register_rule pid=4603 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:08.857378 kernel: audit: type=1300 audit(1765887368.849:677): arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffeb419e10 a2=0 a3=1 items=0 ppid=2997 pid=4603 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.849000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:08.859548 kernel: audit: type=1327 audit(1765887368.849:677): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:08.864000 audit: BPF prog-id=221 op=LOAD Dec 16 12:16:08.867146 kernel: audit: type=1334 audit(1765887368.864:678): prog-id=221 op=LOAD Dec 16 12:16:08.864000 audit: BPF prog-id=222 op=LOAD Dec 16 12:16:08.864000 audit[4597]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4585 pid=4597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 
sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.864000 audit: BPF prog-id=222 op=UNLOAD Dec 16 12:16:08.864000 audit[4597]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.864000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.865000 audit: BPF prog-id=223 op=LOAD Dec 16 12:16:08.865000 audit[4597]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4585 pid=4597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.865000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.865000 audit: BPF prog-id=224 op=LOAD Dec 16 12:16:08.865000 audit[4597]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4585 pid=4597 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.865000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.866000 audit: BPF prog-id=224 op=UNLOAD Dec 16 12:16:08.866000 audit[4597]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.866000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.866000 audit: BPF prog-id=223 op=UNLOAD Dec 16 12:16:08.866000 audit[4597]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4585 pid=4597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.866000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.866000 audit: BPF prog-id=225 op=LOAD Dec 16 12:16:08.866000 audit[4597]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 
a2=98 a3=0 items=0 ppid=4585 pid=4597 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.866000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3164343430373635663833363130346638346464633066626562373762 Dec 16 12:16:08.875000 audit[4618]: NETFILTER_CFG table=filter:131 family=2 entries=17 op=nft_register_rule pid=4618 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:08.875000 audit[4618]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe1b72cd0 a2=0 a3=1 items=0 ppid=2997 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.875000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:08.879000 audit[4618]: NETFILTER_CFG table=nat:132 family=2 entries=35 op=nft_register_chain pid=4618 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:08.879000 audit[4618]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffe1b72cd0 a2=0 a3=1 items=0 ppid=2997 pid=4618 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:08.879000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:08.906328 systemd-networkd[1582]: cali796a623e354: Gained IPv6LL Dec 16 12:16:08.909550 
containerd[1680]: time="2025-12-16T12:16:08.909513068Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-sbbxz,Uid:13366e02-1117-46c3-a880-8d8cc6c423f8,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"1d440765f836104f84ddc0fbeb77b946d350eedf20744120f97e2147172e18f4\"" Dec 16 12:16:08.912739 containerd[1680]: time="2025-12-16T12:16:08.912703845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:09.257149 containerd[1680]: time="2025-12-16T12:16:09.256828200Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:09.258617 containerd[1680]: time="2025-12-16T12:16:09.258527728Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:09.258617 containerd[1680]: time="2025-12-16T12:16:09.258567528Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:09.258774 kubelet[2884]: E1216 12:16:09.258736 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:09.258827 kubelet[2884]: E1216 12:16:09.258787 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:09.258949 kubelet[2884]: E1216 12:16:09.258910 2884 kuberuntime_manager.go:1341] "Unhandled Error" 
err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2lmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:09.260190 kubelet[2884]: E1216 12:16:09.260140 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:16:09.649293 containerd[1680]: time="2025-12-16T12:16:09.649104320Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v5bnx,Uid:896c3574-3482-4970-a592-5c7752aa620e,Namespace:calico-system,Attempt:0,}" Dec 16 12:16:09.649293 containerd[1680]: 
time="2025-12-16T12:16:09.649171921Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-5nkhx,Uid:3cfa8afe-d370-4d42-b9ea-f53cfd764b71,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:16:09.775001 systemd-networkd[1582]: cali31ddc278611: Link UP Dec 16 12:16:09.776356 systemd-networkd[1582]: cali31ddc278611: Gained carrier Dec 16 12:16:09.799284 kubelet[2884]: E1216 12:16:09.799215 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.702 [INFO][4637] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0 calico-apiserver-6784c79f67- calico-apiserver 3cfa8afe-d370-4d42-b9ea-f53cfd764b71 858 0 2025-12-16 12:15:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:6784c79f67 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 calico-apiserver-6784c79f67-5nkhx eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali31ddc278611 [] [] }} ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" 
WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.702 [INFO][4637] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.732 [INFO][4654] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" HandleID="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.732 [INFO][4654] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" HandleID="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000136e60), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"calico-apiserver-6784c79f67-5nkhx", "timestamp":"2025-12-16 12:16:09.732335105 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.732 [INFO][4654] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.732 [INFO][4654] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.732 [INFO][4654] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.744 [INFO][4654] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.749 [INFO][4654] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.753 [INFO][4654] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.755 [INFO][4654] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.758 [INFO][4654] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.758 [INFO][4654] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.760 [INFO][4654] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.765 [INFO][4654] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" 
host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.770 [INFO][4654] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.67.68/26] block=192.168.67.64/26 handle="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.770 [INFO][4654] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.68/26] handle="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.770 [INFO][4654] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:09.799674 containerd[1680]: 2025-12-16 12:16:09.770 [INFO][4654] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.68/26] IPv6=[] ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" HandleID="k8s-pod-network.927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.800813 containerd[1680]: 2025-12-16 12:16:09.772 [INFO][4637] cni-plugin/k8s.go 418: Populated endpoint ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0", GenerateName:"calico-apiserver-6784c79f67-", Namespace:"calico-apiserver", SelfLink:"", UID:"3cfa8afe-d370-4d42-b9ea-f53cfd764b71", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 36, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6784c79f67", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"calico-apiserver-6784c79f67-5nkhx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.67.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali31ddc278611", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:09.800813 containerd[1680]: 2025-12-16 12:16:09.772 [INFO][4637] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.68/32] ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.800813 containerd[1680]: 2025-12-16 12:16:09.772 [INFO][4637] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali31ddc278611 ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.800813 containerd[1680]: 2025-12-16 12:16:09.774 [INFO][4637] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.800813 containerd[1680]: 2025-12-16 12:16:09.774 [INFO][4637] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0", GenerateName:"calico-apiserver-6784c79f67-", Namespace:"calico-apiserver", SelfLink:"", UID:"3cfa8afe-d370-4d42-b9ea-f53cfd764b71", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"6784c79f67", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e", Pod:"calico-apiserver-6784c79f67-5nkhx", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.67.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali31ddc278611", MAC:"06:3a:01:77:c9:f1", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:09.800813 containerd[1680]: 2025-12-16 12:16:09.797 [INFO][4637] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" Namespace="calico-apiserver" Pod="calico-apiserver-6784c79f67-5nkhx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--6784c79f67--5nkhx-eth0" Dec 16 12:16:09.817000 audit[4679]: NETFILTER_CFG table=filter:133 family=2 entries=45 op=nft_register_chain pid=4679 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:09.817000 audit[4679]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24264 a0=3 a1=ffffc755c8c0 a2=0 a3=ffff8ef92fa8 items=0 ppid=4238 pid=4679 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.817000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:09.831000 audit[4685]: NETFILTER_CFG table=filter:134 family=2 entries=14 op=nft_register_rule pid=4685 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:09.831000 audit[4685]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe3aed650 a2=0 a3=1 items=0 ppid=2997 pid=4685 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.831000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:09.834630 containerd[1680]: time="2025-12-16T12:16:09.834588866Z" level=info msg="connecting to shim 927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e" address="unix:///run/containerd/s/b0453ba71996547ff527b8fc9f5a76a768bd433d8b01e0f0f9ee9f4e8ac80f0d" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:09.837000 audit[4685]: NETFILTER_CFG table=nat:135 family=2 entries=20 op=nft_register_rule pid=4685 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:09.837000 audit[4685]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe3aed650 a2=0 a3=1 items=0 ppid=2997 pid=4685 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.837000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:09.859379 systemd[1]: Started cri-containerd-927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e.scope - libcontainer container 927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e. 
Dec 16 12:16:09.869000 audit: BPF prog-id=226 op=LOAD Dec 16 12:16:09.870000 audit: BPF prog-id=227 op=LOAD Dec 16 12:16:09.870000 audit[4702]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.870000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:16:09.870000 audit[4702]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.870000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.870000 audit: BPF prog-id=228 op=LOAD Dec 16 12:16:09.870000 audit[4702]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.870000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.871000 audit: BPF prog-id=229 op=LOAD Dec 16 12:16:09.871000 audit[4702]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.871000 audit: BPF prog-id=229 op=UNLOAD Dec 16 12:16:09.871000 audit[4702]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.871000 audit: BPF prog-id=228 op=UNLOAD Dec 16 12:16:09.871000 audit[4702]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:16:09.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.871000 audit: BPF prog-id=230 op=LOAD Dec 16 12:16:09.871000 audit[4702]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4691 pid=4702 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.871000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3932376566643763356263353161633164373232396530643832613562 Dec 16 12:16:09.881581 systemd-networkd[1582]: caliab1e067817f: Link UP Dec 16 12:16:09.882181 systemd-networkd[1582]: caliab1e067817f: Gained carrier Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.707 [INFO][4626] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0 goldmane-666569f655- calico-system 896c3574-3482-4970-a592-5c7752aa620e 862 0 2025-12-16 12:15:39 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 goldmane-666569f655-v5bnx eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] caliab1e067817f [] [] }} ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" 
WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.707 [INFO][4626] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.740 [INFO][4660] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" HandleID="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Workload="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.740 [INFO][4660] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" HandleID="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Workload="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400051ca60), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"goldmane-666569f655-v5bnx", "timestamp":"2025-12-16 12:16:09.740055624 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.740 [INFO][4660] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.770 [INFO][4660] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.771 [INFO][4660] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.845 [INFO][4660] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.851 [INFO][4660] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.856 [INFO][4660] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.858 [INFO][4660] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.861 [INFO][4660] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.861 [INFO][4660] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.863 [INFO][4660] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871 Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.868 [INFO][4660] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.877 [INFO][4660] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.67.69/26] block=192.168.67.64/26 handle="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.877 [INFO][4660] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.69/26] handle="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.877 [INFO][4660] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:09.895426 containerd[1680]: 2025-12-16 12:16:09.877 [INFO][4660] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.69/26] IPv6=[] ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" HandleID="k8s-pod-network.011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Workload="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.898345 containerd[1680]: 2025-12-16 12:16:09.879 [INFO][4626] cni-plugin/k8s.go 418: Populated endpoint ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"896c3574-3482-4970-a592-5c7752aa620e", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"goldmane-666569f655-v5bnx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.67.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliab1e067817f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:09.898345 containerd[1680]: 2025-12-16 12:16:09.879 [INFO][4626] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.69/32] ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.898345 containerd[1680]: 2025-12-16 12:16:09.879 [INFO][4626] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to caliab1e067817f ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.898345 containerd[1680]: 2025-12-16 12:16:09.882 [INFO][4626] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.898345 containerd[1680]: 2025-12-16 12:16:09.882 [INFO][4626] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"896c3574-3482-4970-a592-5c7752aa620e", ResourceVersion:"862", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 39, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871", Pod:"goldmane-666569f655-v5bnx", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.67.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"caliab1e067817f", MAC:"fe:af:a5:5a:0b:2a", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:09.898345 containerd[1680]: 2025-12-16 12:16:09.892 [INFO][4626] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" Namespace="calico-system" Pod="goldmane-666569f655-v5bnx" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-goldmane--666569f655--v5bnx-eth0" Dec 16 12:16:09.915000 audit[4736]: NETFILTER_CFG table=filter:136 family=2 entries=56 op=nft_register_chain pid=4736 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:09.915000 audit[4736]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28744 a0=3 a1=fffffd925440 a2=0 a3=ffffa0d94fa8 items=0 ppid=4238 pid=4736 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.915000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:09.924593 containerd[1680]: time="2025-12-16T12:16:09.924541365Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-6784c79f67-5nkhx,Uid:3cfa8afe-d370-4d42-b9ea-f53cfd764b71,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"927efd7c5bc51ac1d7229e0d82a5b669d117e28df4ab07f61ed764e74da95e8e\"" Dec 16 12:16:09.927923 containerd[1680]: time="2025-12-16T12:16:09.927885342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:09.931351 systemd-networkd[1582]: cali48aa71c4b1e: Gained IPv6LL Dec 16 12:16:09.934863 containerd[1680]: time="2025-12-16T12:16:09.934782897Z" level=info msg="connecting to shim 011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871" address="unix:///run/containerd/s/9e586778dcd925820373145280982158109546faa976d300071f9196e2ed0f79" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:09.967515 systemd[1]: Started cri-containerd-011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871.scope - libcontainer 
container 011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871. Dec 16 12:16:09.977000 audit: BPF prog-id=231 op=LOAD Dec 16 12:16:09.978000 audit: BPF prog-id=232 op=LOAD Dec 16 12:16:09.978000 audit[4757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:09.978000 audit: BPF prog-id=232 op=UNLOAD Dec 16 12:16:09.978000 audit[4757]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.978000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:09.978000 audit: BPF prog-id=233 op=LOAD Dec 16 12:16:09.978000 audit[4757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.978000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:09.979000 audit: BPF prog-id=234 op=LOAD Dec 16 12:16:09.979000 audit[4757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:09.979000 audit: BPF prog-id=234 op=UNLOAD Dec 16 12:16:09.979000 audit[4757]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:09.979000 audit: BPF prog-id=233 op=UNLOAD Dec 16 12:16:09.979000 audit[4757]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:16:09.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:09.979000 audit: BPF prog-id=235 op=LOAD Dec 16 12:16:09.979000 audit[4757]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4746 pid=4757 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:09.979000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031316238343436646437343163633639313162316362643936306236 Dec 16 12:16:10.003013 containerd[1680]: time="2025-12-16T12:16:10.002941685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-v5bnx,Uid:896c3574-3482-4970-a592-5c7752aa620e,Namespace:calico-system,Attempt:0,} returns sandbox id \"011b8446dd741cc6911b1cbd960b6d5d8e6fcdb8a94429085d75de2892428871\"" Dec 16 12:16:10.276212 containerd[1680]: time="2025-12-16T12:16:10.275938477Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:10.278056 containerd[1680]: time="2025-12-16T12:16:10.278007407Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:10.278154 containerd[1680]: time="2025-12-16T12:16:10.278066248Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, 
bytes read=0" Dec 16 12:16:10.278261 kubelet[2884]: E1216 12:16:10.278227 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:10.278579 kubelet[2884]: E1216 12:16:10.278271 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:10.278579 kubelet[2884]: E1216 12:16:10.278485 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjrmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:10.278803 containerd[1680]: time="2025-12-16T12:16:10.278779771Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:16:10.280099 kubelet[2884]: E1216 12:16:10.280055 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:16:10.615324 containerd[1680]: time="2025-12-16T12:16:10.615024006Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:10.616695 containerd[1680]: time="2025-12-16T12:16:10.616647535Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:16:10.616781 containerd[1680]: time="2025-12-16T12:16:10.616696295Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:10.616948 kubelet[2884]: E1216 12:16:10.616876 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:10.616948 kubelet[2884]: E1216 12:16:10.616936 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:10.617116 kubelet[2884]: E1216 12:16:10.617052 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6kpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:10.618615 kubelet[2884]: E1216 12:16:10.618571 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:16:10.649725 containerd[1680]: time="2025-12-16T12:16:10.649505742Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-9bb48c66-mjzv6,Uid:576d8526-5af6-453c-afc4-7ebd613c4146,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:16:10.754227 systemd-networkd[1582]: cali506d60b10a8: Link UP Dec 16 12:16:10.754890 systemd-networkd[1582]: cali506d60b10a8: Gained carrier Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.689 [INFO][4790] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0 calico-apiserver-9bb48c66- calico-apiserver 576d8526-5af6-453c-afc4-7ebd613c4146 852 0 2025-12-16 12:15:36 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:9bb48c66 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 calico-apiserver-9bb48c66-mjzv6 eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali506d60b10a8 [] [] }} ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.690 [INFO][4790] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.711 [INFO][4804] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" 
HandleID="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.711 [INFO][4804] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" HandleID="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137480), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"calico-apiserver-9bb48c66-mjzv6", "timestamp":"2025-12-16 12:16:10.711731379 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.711 [INFO][4804] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.712 [INFO][4804] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.712 [INFO][4804] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.721 [INFO][4804] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.725 [INFO][4804] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.729 [INFO][4804] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.732 [INFO][4804] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.734 [INFO][4804] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.735 [INFO][4804] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.736 [INFO][4804] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1 Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.741 [INFO][4804] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.749 [INFO][4804] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.67.70/26] block=192.168.67.64/26 handle="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.749 [INFO][4804] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.70/26] handle="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.749 [INFO][4804] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:10.770784 containerd[1680]: 2025-12-16 12:16:10.749 [INFO][4804] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.70/26] IPv6=[] ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" HandleID="k8s-pod-network.c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.771403 containerd[1680]: 2025-12-16 12:16:10.751 [INFO][4790] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0", GenerateName:"calico-apiserver-9bb48c66-", Namespace:"calico-apiserver", SelfLink:"", UID:"576d8526-5af6-453c-afc4-7ebd613c4146", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", 
"k8s-app":"calico-apiserver", "pod-template-hash":"9bb48c66", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"calico-apiserver-9bb48c66-mjzv6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.67.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali506d60b10a8", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:10.771403 containerd[1680]: 2025-12-16 12:16:10.752 [INFO][4790] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.70/32] ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.771403 containerd[1680]: 2025-12-16 12:16:10.752 [INFO][4790] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali506d60b10a8 ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.771403 containerd[1680]: 2025-12-16 12:16:10.755 [INFO][4790] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" 
WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.771403 containerd[1680]: 2025-12-16 12:16:10.755 [INFO][4790] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0", GenerateName:"calico-apiserver-9bb48c66-", Namespace:"calico-apiserver", SelfLink:"", UID:"576d8526-5af6-453c-afc4-7ebd613c4146", ResourceVersion:"852", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 36, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"9bb48c66", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1", Pod:"calico-apiserver-9bb48c66-mjzv6", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.67.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali506d60b10a8", MAC:"9a:33:26:16:c5:5c", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:10.771403 containerd[1680]: 2025-12-16 12:16:10.768 [INFO][4790] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" Namespace="calico-apiserver" Pod="calico-apiserver-9bb48c66-mjzv6" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--apiserver--9bb48c66--mjzv6-eth0" Dec 16 12:16:10.780000 audit[4820]: NETFILTER_CFG table=filter:137 family=2 entries=53 op=nft_register_chain pid=4820 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:10.780000 audit[4820]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26640 a0=3 a1=ffffeb8eed20 a2=0 a3=ffffb1338fa8 items=0 ppid=4238 pid=4820 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.780000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:10.793037 containerd[1680]: time="2025-12-16T12:16:10.792978394Z" level=info msg="connecting to shim c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1" address="unix:///run/containerd/s/312d4942a28ae3b7a1f6ed2848c662ec98a18d4210fb25a52d088c5a2b92a704" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:10.802021 kubelet[2884]: E1216 12:16:10.801969 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:16:10.803559 kubelet[2884]: E1216 12:16:10.803518 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:16:10.804187 kubelet[2884]: E1216 12:16:10.804137 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:16:10.830365 systemd[1]: Started cri-containerd-c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1.scope - libcontainer container c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1. 
Dec 16 12:16:10.833000 audit[4858]: NETFILTER_CFG table=filter:138 family=2 entries=14 op=nft_register_rule pid=4858 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:10.833000 audit[4858]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcf0e8e30 a2=0 a3=1 items=0 ppid=2997 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.833000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:10.838000 audit[4858]: NETFILTER_CFG table=nat:139 family=2 entries=20 op=nft_register_rule pid=4858 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:10.838000 audit[4858]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffcf0e8e30 a2=0 a3=1 items=0 ppid=2997 pid=4858 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.838000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:10.844000 audit: BPF prog-id=236 op=LOAD Dec 16 12:16:10.844000 audit: BPF prog-id=237 op=LOAD Dec 16 12:16:10.844000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.844000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.845000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:16:10.845000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.845000 audit: BPF prog-id=238 op=LOAD Dec 16 12:16:10.845000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.845000 audit: BPF prog-id=239 op=LOAD Dec 16 12:16:10.845000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:16:10.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.845000 audit: BPF prog-id=239 op=UNLOAD Dec 16 12:16:10.845000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.845000 audit: BPF prog-id=238 op=UNLOAD Dec 16 12:16:10.845000 audit[4842]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.845000 audit: BPF prog-id=240 op=LOAD Dec 16 12:16:10.845000 audit[4842]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4829 pid=4842 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:10.845000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6335656562333434396464366435646439373430356139353663666131 Dec 16 12:16:10.883674 containerd[1680]: time="2025-12-16T12:16:10.883458455Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-9bb48c66-mjzv6,Uid:576d8526-5af6-453c-afc4-7ebd613c4146,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"c5eeb3449dd6d5dd97405a956cfa1a1728ed4c1a26c239387a2a334ee730f7a1\"" Dec 16 12:16:10.885352 containerd[1680]: time="2025-12-16T12:16:10.885256704Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:11.221600 containerd[1680]: time="2025-12-16T12:16:11.221545299Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:11.222990 containerd[1680]: time="2025-12-16T12:16:11.222942027Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:11.223156 containerd[1680]: time="2025-12-16T12:16:11.223033147Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:11.223270 kubelet[2884]: E1216 12:16:11.223226 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:11.223385 kubelet[2884]: E1216 12:16:11.223282 2884 kuberuntime_image.go:55] "Failed to 
pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:11.223644 kubelet[2884]: E1216 12:16:11.223520 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckh7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:11.224822 kubelet[2884]: E1216 12:16:11.224775 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:16:11.530209 systemd-networkd[1582]: caliab1e067817f: Gained IPv6LL Dec 16 12:16:11.649973 containerd[1680]: time="2025-12-16T12:16:11.649741483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7w7l8,Uid:f1aba02a-fd77-4f59-8d87-b35648e9b9d3,Namespace:kube-system,Attempt:0,}" 
Dec 16 12:16:11.649973 containerd[1680]: time="2025-12-16T12:16:11.649868564Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2hkqn,Uid:504cf836-455c-42a5-8d68-245e5d4890cf,Namespace:calico-system,Attempt:0,}" Dec 16 12:16:11.649973 containerd[1680]: time="2025-12-16T12:16:11.649951644Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b9b8c464c-4jgjb,Uid:384a06af-c494-40e9-b0a8-31b5c5a33ae4,Namespace:calico-system,Attempt:0,}" Dec 16 12:16:11.660336 systemd-networkd[1582]: cali31ddc278611: Gained IPv6LL Dec 16 12:16:11.788437 systemd-networkd[1582]: cali468f40b9322: Link UP Dec 16 12:16:11.792506 systemd-networkd[1582]: cali468f40b9322: Gained carrier Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.706 [INFO][4872] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0 coredns-668d6bf9bc- kube-system f1aba02a-fd77-4f59-8d87-b35648e9b9d3 857 0 2025-12-16 12:15:26 +0000 UTC map[k8s-app:kube-dns pod-template-hash:668d6bf9bc projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 coredns-668d6bf9bc-7w7l8 eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali468f40b9322 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.706 [INFO][4872] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" 
WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.738 [INFO][4920] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" HandleID="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Workload="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.739 [INFO][4920] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" HandleID="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Workload="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400012e570), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"coredns-668d6bf9bc-7w7l8", "timestamp":"2025-12-16 12:16:11.738982658 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.739 [INFO][4920] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.739 [INFO][4920] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.739 [INFO][4920] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.748 [INFO][4920] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.753 [INFO][4920] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.759 [INFO][4920] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.761 [INFO][4920] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.763 [INFO][4920] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.763 [INFO][4920] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.764 [INFO][4920] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.769 [INFO][4920] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.776 [INFO][4920] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.67.71/26] block=192.168.67.64/26 handle="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.776 [INFO][4920] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.71/26] handle="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.776 [INFO][4920] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:11.806784 containerd[1680]: 2025-12-16 12:16:11.776 [INFO][4920] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.71/26] IPv6=[] ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" HandleID="k8s-pod-network.6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Workload="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.807312 containerd[1680]: 2025-12-16 12:16:11.778 [INFO][4872] cni-plugin/k8s.go 418: Populated endpoint ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f1aba02a-fd77-4f59-8d87-b35648e9b9d3", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"coredns-668d6bf9bc-7w7l8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.67.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali468f40b9322", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:11.807312 containerd[1680]: 2025-12-16 12:16:11.778 [INFO][4872] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.71/32] ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.807312 containerd[1680]: 2025-12-16 12:16:11.778 [INFO][4872] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali468f40b9322 ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.807312 containerd[1680]: 2025-12-16 12:16:11.792 [INFO][4872] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.807312 containerd[1680]: 2025-12-16 12:16:11.793 [INFO][4872] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0", GenerateName:"coredns-668d6bf9bc-", Namespace:"kube-system", SelfLink:"", UID:"f1aba02a-fd77-4f59-8d87-b35648e9b9d3", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 26, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"668d6bf9bc", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f", Pod:"coredns-668d6bf9bc-7w7l8", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.67.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali468f40b9322", 
MAC:"f2:dd:77:f6:04:72", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:11.807312 containerd[1680]: 2025-12-16 12:16:11.802 [INFO][4872] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" Namespace="kube-system" Pod="coredns-668d6bf9bc-7w7l8" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-coredns--668d6bf9bc--7w7l8-eth0" Dec 16 12:16:11.808936 kubelet[2884]: E1216 12:16:11.808863 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:16:11.810390 kubelet[2884]: E1216 12:16:11.808817 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" 
pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:16:11.810700 kubelet[2884]: E1216 12:16:11.810497 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:16:11.826000 audit[4956]: NETFILTER_CFG table=filter:140 family=2 entries=58 op=nft_register_chain pid=4956 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:11.826000 audit[4956]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26760 a0=3 a1=fffff0da4c10 a2=0 a3=ffff9cd94fa8 items=0 ppid=4238 pid=4956 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.826000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:11.843030 containerd[1680]: time="2025-12-16T12:16:11.842962429Z" level=info msg="connecting to shim 6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f" address="unix:///run/containerd/s/3ce11860d2a13efe8ec5ef32354bf369ecfcee03ddaaa265b708ab032bad98e6" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:11.850343 systemd-networkd[1582]: cali506d60b10a8: Gained IPv6LL Dec 16 12:16:11.859000 audit[4978]: NETFILTER_CFG table=filter:141 family=2 entries=14 op=nft_register_rule pid=4978 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 
12:16:11.859000 audit[4978]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffde875490 a2=0 a3=1 items=0 ppid=2997 pid=4978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.859000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:11.864000 audit[4978]: NETFILTER_CFG table=nat:142 family=2 entries=20 op=nft_register_rule pid=4978 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:11.864000 audit[4978]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffde875490 a2=0 a3=1 items=0 ppid=2997 pid=4978 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.864000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:11.878327 systemd[1]: Started cri-containerd-6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f.scope - libcontainer container 6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f. 
Dec 16 12:16:11.888000 audit: BPF prog-id=241 op=LOAD Dec 16 12:16:11.890000 audit: BPF prog-id=242 op=LOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.890000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.890000 audit: BPF prog-id=243 op=LOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.890000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.890000 audit: BPF prog-id=244 op=LOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.890000 audit: BPF prog-id=244 op=UNLOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.890000 audit: BPF prog-id=243 op=UNLOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:16:11.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.890000 audit: BPF prog-id=245 op=LOAD Dec 16 12:16:11.890000 audit[4979]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4966 pid=4979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.890000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3633343663306432356261663333643864356531353939633433316338 Dec 16 12:16:11.902199 systemd-networkd[1582]: calic3be3474167: Link UP Dec 16 12:16:11.903367 systemd-networkd[1582]: calic3be3474167: Gained carrier Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.711 [INFO][4885] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0 csi-node-driver- calico-system 504cf836-455c-42a5-8d68-245e5d4890cf 736 0 2025-12-16 12:15:41 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 csi-node-driver-2hkqn eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] calic3be3474167 [] [] }} 
ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.711 [INFO][4885] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.740 [INFO][4927] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" HandleID="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Workload="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.741 [INFO][4927] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" HandleID="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Workload="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d460), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"csi-node-driver-2hkqn", "timestamp":"2025-12-16 12:16:11.740844868 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.741 [INFO][4927] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.776 [INFO][4927] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.776 [INFO][4927] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.851 [INFO][4927] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.862 [INFO][4927] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.869 [INFO][4927] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.873 [INFO][4927] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.876 [INFO][4927] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.876 [INFO][4927] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.877 [INFO][4927] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.887 [INFO][4927] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" 
host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.894 [INFO][4927] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.67.72/26] block=192.168.67.64/26 handle="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.894 [INFO][4927] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.72/26] handle="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.894 [INFO][4927] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:11.919829 containerd[1680]: 2025-12-16 12:16:11.894 [INFO][4927] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.72/26] IPv6=[] ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" HandleID="k8s-pod-network.bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Workload="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.920566 containerd[1680]: 2025-12-16 12:16:11.899 [INFO][4885] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"504cf836-455c-42a5-8d68-245e5d4890cf", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"csi-node-driver-2hkqn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.67.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic3be3474167", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:11.920566 containerd[1680]: 2025-12-16 12:16:11.899 [INFO][4885] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.72/32] ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.920566 containerd[1680]: 2025-12-16 12:16:11.899 [INFO][4885] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calic3be3474167 ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.920566 containerd[1680]: 2025-12-16 12:16:11.902 [INFO][4885] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" 
Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.920566 containerd[1680]: 2025-12-16 12:16:11.903 [INFO][4885] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"504cf836-455c-42a5-8d68-245e5d4890cf", ResourceVersion:"736", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c", Pod:"csi-node-driver-2hkqn", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.67.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"calic3be3474167", MAC:"f2:62:6c:74:ca:32", 
Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:11.920566 containerd[1680]: 2025-12-16 12:16:11.916 [INFO][4885] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" Namespace="calico-system" Pod="csi-node-driver-2hkqn" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-csi--node--driver--2hkqn-eth0" Dec 16 12:16:11.930000 audit[5016]: NETFILTER_CFG table=filter:143 family=2 entries=56 op=nft_register_chain pid=5016 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:11.930000 audit[5016]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=25500 a0=3 a1=ffffe31f7360 a2=0 a3=ffff919f1fa8 items=0 ppid=4238 pid=5016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:11.930000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:11.934076 containerd[1680]: time="2025-12-16T12:16:11.934019933Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-668d6bf9bc-7w7l8,Uid:f1aba02a-fd77-4f59-8d87-b35648e9b9d3,Namespace:kube-system,Attempt:0,} returns sandbox id \"6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f\"" Dec 16 12:16:11.939208 containerd[1680]: time="2025-12-16T12:16:11.939168399Z" level=info msg="CreateContainer within sandbox \"6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:16:11.961488 containerd[1680]: time="2025-12-16T12:16:11.961444353Z" level=info msg="connecting to shim bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c" 
address="unix:///run/containerd/s/d7abd2a8304d97a8de4da294b50e5149205ab7c76990262b8e7bec8d5a150793" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:11.961947 containerd[1680]: time="2025-12-16T12:16:11.961904315Z" level=info msg="Container 3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:16:11.973162 containerd[1680]: time="2025-12-16T12:16:11.973119652Z" level=info msg="CreateContainer within sandbox \"6346c0d25baf33d8d5e1599c431c8bf71e796cc123f7ad1d421f6208cdc7b62f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc\"" Dec 16 12:16:11.974190 containerd[1680]: time="2025-12-16T12:16:11.974042617Z" level=info msg="StartContainer for \"3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc\"" Dec 16 12:16:11.975304 containerd[1680]: time="2025-12-16T12:16:11.975276343Z" level=info msg="connecting to shim 3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc" address="unix:///run/containerd/s/3ce11860d2a13efe8ec5ef32354bf369ecfcee03ddaaa265b708ab032bad98e6" protocol=ttrpc version=3 Dec 16 12:16:11.990544 systemd[1]: Started cri-containerd-bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c.scope - libcontainer container bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c. Dec 16 12:16:12.003237 systemd-networkd[1582]: cali377db9ea3de: Link UP Dec 16 12:16:12.003448 systemd-networkd[1582]: cali377db9ea3de: Gained carrier Dec 16 12:16:12.005262 systemd[1]: Started cri-containerd-3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc.scope - libcontainer container 3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc. 
Dec 16 12:16:12.007000 audit: BPF prog-id=246 op=LOAD Dec 16 12:16:12.010000 audit: BPF prog-id=247 op=LOAD Dec 16 12:16:12.010000 audit[5037]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.010000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:16:12.010000 audit[5037]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.010000 audit: BPF prog-id=248 op=LOAD Dec 16 12:16:12.010000 audit[5037]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.010000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.010000 audit: BPF prog-id=249 op=LOAD Dec 16 12:16:12.010000 audit[5037]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.010000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.011000 audit: BPF prog-id=249 op=UNLOAD Dec 16 12:16:12.011000 audit[5037]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.011000 audit: BPF prog-id=248 op=UNLOAD Dec 16 12:16:12.011000 audit[5037]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:16:12.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.011000 audit: BPF prog-id=250 op=LOAD Dec 16 12:16:12.011000 audit[5037]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=5026 pid=5037 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.011000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6262656335346362306236633165303263643532356331333563373832 Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.729 [INFO][4899] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0 calico-kube-controllers-6b9b8c464c- calico-system 384a06af-c494-40e9-b0a8-31b5c5a33ae4 856 0 2025-12-16 12:15:41 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b9b8c464c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-0-5b424f63c8 calico-kube-controllers-6b9b8c464c-4jgjb eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali377db9ea3de [] [] }} ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" 
WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.729 [INFO][4899] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.758 [INFO][4934] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" HandleID="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.758 [INFO][4934] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" HandleID="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000584590), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-0-5b424f63c8", "pod":"calico-kube-controllers-6b9b8c464c-4jgjb", "timestamp":"2025-12-16 12:16:11.758680479 +0000 UTC"}, Hostname:"ci-4547-0-0-0-5b424f63c8", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.758 [INFO][4934] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.898 [INFO][4934] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.898 [INFO][4934] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-0-5b424f63c8' Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.956 [INFO][4934] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.964 [INFO][4934] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.970 [INFO][4934] ipam/ipam.go 511: Trying affinity for 192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.972 [INFO][4934] ipam/ipam.go 158: Attempting to load block cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.978 [INFO][4934] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.67.64/26 host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.979 [INFO][4934] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.67.64/26 handle="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.982 [INFO][4934] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63 Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.986 [INFO][4934] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.67.64/26 handle="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" 
host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.995 [INFO][4934] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.67.73/26] block=192.168.67.64/26 handle="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.995 [INFO][4934] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.67.73/26] handle="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" host="ci-4547-0-0-0-5b424f63c8" Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.995 [INFO][4934] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:16:12.023200 containerd[1680]: 2025-12-16 12:16:11.995 [INFO][4934] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.67.73/26] IPv6=[] ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" HandleID="k8s-pod-network.0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Workload="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.024463 containerd[1680]: 2025-12-16 12:16:11.999 [INFO][4899] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0", GenerateName:"calico-kube-controllers-6b9b8c464c-", Namespace:"calico-system", SelfLink:"", UID:"384a06af-c494-40e9-b0a8-31b5c5a33ae4", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 41, 
0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b9b8c464c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"", Pod:"calico-kube-controllers-6b9b8c464c-4jgjb", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.67.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali377db9ea3de", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:12.024463 containerd[1680]: 2025-12-16 12:16:11.999 [INFO][4899] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.67.73/32] ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.024463 containerd[1680]: 2025-12-16 12:16:11.999 [INFO][4899] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali377db9ea3de ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.024463 containerd[1680]: 2025-12-16 12:16:12.004 [INFO][4899] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.024463 containerd[1680]: 2025-12-16 12:16:12.005 [INFO][4899] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0", GenerateName:"calico-kube-controllers-6b9b8c464c-", Namespace:"calico-system", SelfLink:"", UID:"384a06af-c494-40e9-b0a8-31b5c5a33ae4", ResourceVersion:"856", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 15, 41, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b9b8c464c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-0-5b424f63c8", ContainerID:"0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63", Pod:"calico-kube-controllers-6b9b8c464c-4jgjb", Endpoint:"eth0", 
ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.67.73/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali377db9ea3de", MAC:"d6:95:1f:ee:14:06", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:16:12.024463 containerd[1680]: 2025-12-16 12:16:12.019 [INFO][4899] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" Namespace="calico-system" Pod="calico-kube-controllers-6b9b8c464c-4jgjb" WorkloadEndpoint="ci--4547--0--0--0--5b424f63c8-k8s-calico--kube--controllers--6b9b8c464c--4jgjb-eth0" Dec 16 12:16:12.026000 audit: BPF prog-id=251 op=LOAD Dec 16 12:16:12.028000 audit: BPF prog-id=252 op=LOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.028000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.028000 audit: BPF prog-id=253 op=LOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.028000 audit: BPF prog-id=254 op=LOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.028000 audit: BPF prog-id=254 op=UNLOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.028000 audit: BPF prog-id=253 op=UNLOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.028000 audit: BPF prog-id=255 op=LOAD Dec 16 12:16:12.028000 audit[5049]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4966 pid=5049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.028000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3365306363633430303034396533363265393537333734653039386666 Dec 16 12:16:12.042000 audit[5086]: NETFILTER_CFG table=filter:144 family=2 entries=66 op=nft_register_chain pid=5086 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:16:12.042000 audit[5086]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=29540 a0=3 a1=fffff833b4c0 a2=0 a3=ffffb4f31fa8 items=0 
ppid=4238 pid=5086 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.042000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:16:12.055430 containerd[1680]: time="2025-12-16T12:16:12.055298111Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-2hkqn,Uid:504cf836-455c-42a5-8d68-245e5d4890cf,Namespace:calico-system,Attempt:0,} returns sandbox id \"bbec54cb0b6c1e02cd525c135c782fdbd1ea555f8c74ea48e5b42628befaeb0c\"" Dec 16 12:16:12.057726 containerd[1680]: time="2025-12-16T12:16:12.057697724Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:16:12.061302 containerd[1680]: time="2025-12-16T12:16:12.061270262Z" level=info msg="StartContainer for \"3e0ccc400049e362e957374e098fffd56b7e451d52e0a453ae1615b23c3196dc\" returns successfully" Dec 16 12:16:12.065014 containerd[1680]: time="2025-12-16T12:16:12.064756440Z" level=info msg="connecting to shim 0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63" address="unix:///run/containerd/s/0e6cf36d9f1f50857e3092facbbed48842668694abf9c582f2c3ae3f72a3c05a" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:16:12.097337 systemd[1]: Started cri-containerd-0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63.scope - libcontainer container 0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63. 
Dec 16 12:16:12.117000 audit: BPF prog-id=256 op=LOAD Dec 16 12:16:12.117000 audit: BPF prog-id=257 op=LOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.117000 audit: BPF prog-id=257 op=UNLOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.117000 audit: BPF prog-id=258 op=LOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.117000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.117000 audit: BPF prog-id=259 op=LOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.117000 audit: BPF prog-id=259 op=UNLOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.117000 audit: BPF prog-id=258 op=UNLOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:16:12.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.117000 audit: BPF prog-id=260 op=LOAD Dec 16 12:16:12.117000 audit[5123]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5111 pid=5123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.117000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3064383861336465653731326537383764653034323362343635653136 Dec 16 12:16:12.149892 containerd[1680]: time="2025-12-16T12:16:12.149832354Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b9b8c464c-4jgjb,Uid:384a06af-c494-40e9-b0a8-31b5c5a33ae4,Namespace:calico-system,Attempt:0,} returns sandbox id \"0d88a3dee712e787de0423b465e166a7d97d972490f919051b3bb3913b241a63\"" Dec 16 12:16:12.428160 containerd[1680]: time="2025-12-16T12:16:12.428063452Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:12.429807 containerd[1680]: time="2025-12-16T12:16:12.429764661Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:16:12.429900 containerd[1680]: time="2025-12-16T12:16:12.429853702Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes 
read=0" Dec 16 12:16:12.430272 kubelet[2884]: E1216 12:16:12.430052 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:12.430272 kubelet[2884]: E1216 12:16:12.430121 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:12.430385 kubelet[2884]: E1216 12:16:12.430342 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,
TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:12.430462 containerd[1680]: time="2025-12-16T12:16:12.430412624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:16:12.767909 containerd[1680]: time="2025-12-16T12:16:12.767795705Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:12.769464 containerd[1680]: time="2025-12-16T12:16:12.769405953Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:16:12.769576 containerd[1680]: time="2025-12-16T12:16:12.769474674Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:12.769668 kubelet[2884]: E1216 12:16:12.769633 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack 
image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:12.769711 kubelet[2884]: E1216 12:16:12.769680 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:12.770210 kubelet[2884]: E1216 12:16:12.769895 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{Pr
obeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:12.770367 containerd[1680]: time="2025-12-16T12:16:12.769974836Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:16:12.771343 kubelet[2884]: E1216 12:16:12.771156 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:16:12.813125 kubelet[2884]: E1216 12:16:12.812125 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:16:12.816334 kubelet[2884]: E1216 12:16:12.816302 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:16:12.838202 kubelet[2884]: I1216 12:16:12.838140 2884 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-668d6bf9bc-7w7l8" podStartSLOduration=46.838122624 podStartE2EDuration="46.838122624s" podCreationTimestamp="2025-12-16 12:15:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:16:12.836234574 +0000 UTC m=+52.455228034" watchObservedRunningTime="2025-12-16 12:16:12.838122624 +0000 UTC 
m=+52.457116084" Dec 16 12:16:12.850000 audit[5152]: NETFILTER_CFG table=filter:145 family=2 entries=14 op=nft_register_rule pid=5152 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:12.850000 audit[5152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe4176260 a2=0 a3=1 items=0 ppid=2997 pid=5152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.850000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:12.858000 audit[5152]: NETFILTER_CFG table=nat:146 family=2 entries=44 op=nft_register_rule pid=5152 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:12.858000 audit[5152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffe4176260 a2=0 a3=1 items=0 ppid=2997 pid=5152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:12.858000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:13.108847 containerd[1680]: time="2025-12-16T12:16:13.108716604Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:13.110518 containerd[1680]: time="2025-12-16T12:16:13.110411812Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:16:13.110518 containerd[1680]: time="2025-12-16T12:16:13.110471173Z" 
level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:13.110807 kubelet[2884]: E1216 12:16:13.110616 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:16:13.110807 kubelet[2884]: E1216 12:16:13.110667 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:16:13.110807 kubelet[2884]: E1216 12:16:13.110777 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:13.112040 kubelet[2884]: E1216 12:16:13.111960 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:16:13.386200 systemd-networkd[1582]: cali468f40b9322: Gained IPv6LL Dec 16 12:16:13.642346 systemd-networkd[1582]: calic3be3474167: Gained IPv6LL Dec 16 12:16:13.817164 kubelet[2884]: E1216 12:16:13.816619 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:16:13.818591 kubelet[2884]: E1216 12:16:13.817600 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:16:13.836208 systemd-networkd[1582]: cali377db9ea3de: Gained IPv6LL Dec 16 12:16:13.873000 audit[5154]: NETFILTER_CFG table=filter:147 family=2 entries=14 op=nft_register_rule pid=5154 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:13.878428 kernel: kauditd_printk_skb: 223 callbacks suppressed Dec 16 12:16:13.878471 kernel: audit: type=1325 audit(1765887373.873:758): table=filter:147 family=2 entries=14 op=nft_register_rule pid=5154 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:13.878497 kernel: audit: type=1300 audit(1765887373.873:758): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe15a6680 a2=0 a3=1 items=0 ppid=2997 pid=5154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:13.873000 audit[5154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe15a6680 a2=0 a3=1 items=0 ppid=2997 pid=5154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:13.873000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:13.883669 kernel: audit: type=1327 audit(1765887373.873:758): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:13.897000 audit[5154]: NETFILTER_CFG table=nat:148 family=2 entries=56 op=nft_register_chain pid=5154 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:13.897000 audit[5154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffe15a6680 a2=0 a3=1 items=0 ppid=2997 pid=5154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:13.904094 kernel: audit: type=1325 audit(1765887373.897:759): table=nat:148 family=2 entries=56 op=nft_register_chain pid=5154 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:16:13.904149 kernel: audit: type=1300 audit(1765887373.897:759): arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffe15a6680 a2=0 a3=1 items=0 ppid=2997 pid=5154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:16:13.904169 kernel: audit: type=1327 audit(1765887373.897:759): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:13.897000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:16:16.650297 containerd[1680]: time="2025-12-16T12:16:16.650236305Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:16:17.001460 containerd[1680]: time="2025-12-16T12:16:17.001411216Z" level=info 
msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:17.002645 containerd[1680]: time="2025-12-16T12:16:17.002616982Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:16:17.002722 containerd[1680]: time="2025-12-16T12:16:17.002674742Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:17.003032 kubelet[2884]: E1216 12:16:17.002861 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:16:17.003032 kubelet[2884]: E1216 12:16:17.002910 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:16:17.003389 kubelet[2884]: E1216 12:16:17.003008 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bce81bb11dd34fd8b7a0b5197b60303d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:17.005428 containerd[1680]: time="2025-12-16T12:16:17.005398676Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:16:17.349859 containerd[1680]: 
time="2025-12-16T12:16:17.349733672Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:17.353157 containerd[1680]: time="2025-12-16T12:16:17.353116770Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:16:17.353331 containerd[1680]: time="2025-12-16T12:16:17.353231930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:17.353488 kubelet[2884]: E1216 12:16:17.353436 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:16:17.353554 kubelet[2884]: E1216 12:16:17.353501 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:16:17.353657 kubelet[2884]: E1216 12:16:17.353608 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:17.354833 kubelet[2884]: E1216 12:16:17.354796 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:16:22.649288 containerd[1680]: time="2025-12-16T12:16:22.649223779Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:16:23.012089 containerd[1680]: time="2025-12-16T12:16:23.012016789Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:23.013904 containerd[1680]: time="2025-12-16T12:16:23.013840558Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:16:23.014004 containerd[1680]: time="2025-12-16T12:16:23.013883359Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:23.014140 kubelet[2884]: E1216 12:16:23.014092 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to 
resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:23.014417 kubelet[2884]: E1216 12:16:23.014147 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:23.014417 kubelet[2884]: E1216 12:16:23.014277 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6kpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPa
thExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:23.015733 kubelet[2884]: E1216 12:16:23.015624 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 
16 12:16:23.650204 containerd[1680]: time="2025-12-16T12:16:23.650157924Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:23.995327 containerd[1680]: time="2025-12-16T12:16:23.995280844Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:23.996769 containerd[1680]: time="2025-12-16T12:16:23.996729051Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:23.996852 containerd[1680]: time="2025-12-16T12:16:23.996765051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:23.997076 kubelet[2884]: E1216 12:16:23.997020 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:23.997139 kubelet[2884]: E1216 12:16:23.997096 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:23.997423 containerd[1680]: time="2025-12-16T12:16:23.997383734Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:23.997546 kubelet[2884]: E1216 12:16:23.997394 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2lmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:24.000422 kubelet[2884]: E1216 12:16:24.000365 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:16:24.346652 containerd[1680]: time="2025-12-16T12:16:24.346531955Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:24.350166 containerd[1680]: time="2025-12-16T12:16:24.350114613Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:24.350220 containerd[1680]: time="2025-12-16T12:16:24.350175614Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:24.350588 kubelet[2884]: E1216 12:16:24.350364 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:24.350588 kubelet[2884]: E1216 12:16:24.350418 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code 
= NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:24.350588 kubelet[2884]: E1216 12:16:24.350533 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckh7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:24.351816 kubelet[2884]: E1216 12:16:24.351751 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:16:24.651452 containerd[1680]: time="2025-12-16T12:16:24.651274629Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:24.977966 containerd[1680]: time="2025-12-16T12:16:24.977913295Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:16:24.979363 containerd[1680]: time="2025-12-16T12:16:24.979325822Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:24.979464 containerd[1680]: time="2025-12-16T12:16:24.979368582Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:24.979544 kubelet[2884]: E1216 12:16:24.979509 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:24.979585 kubelet[2884]: E1216 12:16:24.979556 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:24.979749 kubelet[2884]: E1216 12:16:24.979709 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjrmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:24.981140 kubelet[2884]: E1216 12:16:24.981105 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:16:26.649866 containerd[1680]: time="2025-12-16T12:16:26.649816141Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:16:27.003616 containerd[1680]: time="2025-12-16T12:16:27.003501025Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:27.005099 containerd[1680]: time="2025-12-16T12:16:27.005045073Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:16:27.005277 containerd[1680]: time="2025-12-16T12:16:27.005094553Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:27.005318 kubelet[2884]: E1216 12:16:27.005246 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:27.005318 kubelet[2884]: E1216 12:16:27.005293 2884 
kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:27.005598 kubelet[2884]: E1216 12:16:27.005403 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:27.006634 kubelet[2884]: E1216 12:16:27.006595 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:16:27.649732 
containerd[1680]: time="2025-12-16T12:16:27.649696161Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:16:27.966910 containerd[1680]: time="2025-12-16T12:16:27.966854018Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:27.968163 containerd[1680]: time="2025-12-16T12:16:27.968117265Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:16:27.968236 containerd[1680]: time="2025-12-16T12:16:27.968192345Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:27.968585 kubelet[2884]: E1216 12:16:27.968363 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:27.968585 kubelet[2884]: E1216 12:16:27.968415 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:27.968585 kubelet[2884]: E1216 12:16:27.968531 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:16:27.970438 containerd[1680]: time="2025-12-16T12:16:27.970354396Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:16:28.297866 containerd[1680]: time="2025-12-16T12:16:28.297743346Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:28.299317 containerd[1680]: time="2025-12-16T12:16:28.299254113Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:16:28.299390 containerd[1680]: time="2025-12-16T12:16:28.299331434Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:28.299547 kubelet[2884]: E1216 12:16:28.299514 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:16:28.300017 kubelet[2884]: E1216 12:16:28.299829 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:16:28.300017 kubelet[2884]: E1216 12:16:28.299965 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:28.301172 kubelet[2884]: E1216 12:16:28.301126 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:16:32.650703 kubelet[2884]: E1216 12:16:32.650604 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:16:34.653000 kubelet[2884]: E1216 12:16:34.652962 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off 
pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:16:36.649480 kubelet[2884]: E1216 12:16:36.649433 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:16:38.650749 kubelet[2884]: E1216 12:16:38.650303 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:16:40.649799 kubelet[2884]: E1216 12:16:40.649740 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:16:41.649649 kubelet[2884]: E1216 12:16:41.649598 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:16:43.650234 kubelet[2884]: E1216 12:16:43.650066 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:16:46.652522 containerd[1680]: time="2025-12-16T12:16:46.652477232Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:16:46.984285 containerd[1680]: time="2025-12-16T12:16:46.984210124Z" level=info msg="fetch failed after 
status: 404 Not Found" host=ghcr.io Dec 16 12:16:46.985767 containerd[1680]: time="2025-12-16T12:16:46.985713012Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:16:46.985839 containerd[1680]: time="2025-12-16T12:16:46.985802132Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:46.986114 kubelet[2884]: E1216 12:16:46.985973 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:46.986401 kubelet[2884]: E1216 12:16:46.986124 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:46.986401 kubelet[2884]: E1216 12:16:46.986260 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6kpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:46.987508 kubelet[2884]: E1216 12:16:46.987440 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:16:47.649515 containerd[1680]: time="2025-12-16T12:16:47.649474277Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:16:48.000614 containerd[1680]: time="2025-12-16T12:16:48.000563587Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:48.001854 containerd[1680]: 
time="2025-12-16T12:16:48.001812954Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:16:48.002725 containerd[1680]: time="2025-12-16T12:16:48.001886714Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:48.002801 kubelet[2884]: E1216 12:16:48.002181 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:16:48.002801 kubelet[2884]: E1216 12:16:48.002229 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:16:48.003679 kubelet[2884]: E1216 12:16:48.002337 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bce81bb11dd34fd8b7a0b5197b60303d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:48.006257 containerd[1680]: time="2025-12-16T12:16:48.006172176Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:16:48.359971 containerd[1680]: 
time="2025-12-16T12:16:48.359842940Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:48.361599 containerd[1680]: time="2025-12-16T12:16:48.361550948Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:16:48.361702 containerd[1680]: time="2025-12-16T12:16:48.361583908Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:48.361840 kubelet[2884]: E1216 12:16:48.361786 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:16:48.361886 kubelet[2884]: E1216 12:16:48.361842 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:16:48.361993 kubelet[2884]: E1216 12:16:48.361946 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:48.363188 kubelet[2884]: E1216 12:16:48.363126 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:16:49.651855 containerd[1680]: time="2025-12-16T12:16:49.651809808Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:50.031582 containerd[1680]: time="2025-12-16T12:16:50.031502865Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:50.033652 containerd[1680]: time="2025-12-16T12:16:50.033576635Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:50.033727 containerd[1680]: time="2025-12-16T12:16:50.033694276Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:50.033919 kubelet[2884]: E1216 12:16:50.033880 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:50.034207 kubelet[2884]: E1216 12:16:50.033935 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:50.034207 kubelet[2884]: E1216 12:16:50.034057 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckh7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:50.035274 kubelet[2884]: E1216 12:16:50.035226 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:16:50.650285 containerd[1680]: time="2025-12-16T12:16:50.650239540Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:50.992650 containerd[1680]: time="2025-12-16T12:16:50.992580286Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:16:50.994061 containerd[1680]: time="2025-12-16T12:16:50.994027014Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:50.994218 containerd[1680]: time="2025-12-16T12:16:50.994091374Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:50.994276 kubelet[2884]: E1216 12:16:50.994235 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:50.994330 kubelet[2884]: E1216 12:16:50.994286 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:50.994445 kubelet[2884]: E1216 12:16:50.994405 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2lmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:50.995710 kubelet[2884]: E1216 12:16:50.995662 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:16:51.649718 containerd[1680]: time="2025-12-16T12:16:51.649674757Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:16:52.001846 containerd[1680]: time="2025-12-16T12:16:52.001789033Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:52.003217 containerd[1680]: time="2025-12-16T12:16:52.003178360Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:16:52.003312 containerd[1680]: time="2025-12-16T12:16:52.003282601Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:52.003470 kubelet[2884]: E1216 12:16:52.003428 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:52.003670 kubelet[2884]: E1216 12:16:52.003484 2884 kuberuntime_image.go:55] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:16:52.003670 kubelet[2884]: E1216 12:16:52.003594 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjrmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:52.004755 kubelet[2884]: E1216 12:16:52.004731 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:16:55.650547 containerd[1680]: time="2025-12-16T12:16:55.650338280Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:16:55.997222 containerd[1680]: time="2025-12-16T12:16:55.996770407Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:16:55.998572 containerd[1680]: time="2025-12-16T12:16:55.998530776Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:16:55.998745 containerd[1680]: time="2025-12-16T12:16:55.998598296Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:55.998917 kubelet[2884]: E1216 12:16:55.998857 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:55.999192 kubelet[2884]: E1216 12:16:55.998927 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:56.000121 kubelet[2884]: E1216 12:16:55.999059 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:56.000562 kubelet[2884]: E1216 12:16:56.000533 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:16:58.652092 containerd[1680]: time="2025-12-16T12:16:58.651183984Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:16:59.013463 containerd[1680]: time="2025-12-16T12:16:59.013358031Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:16:59.014956 containerd[1680]: time="2025-12-16T12:16:59.014664518Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:16:59.014956 containerd[1680]: time="2025-12-16T12:16:59.014739478Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:59.015117 kubelet[2884]: E1216 12:16:59.014866 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:59.015117 kubelet[2884]: E1216 12:16:59.014916 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:59.015117 kubelet[2884]: E1216 12:16:59.015022 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:16:59.018020 containerd[1680]: time="2025-12-16T12:16:59.017983255Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:16:59.348950 containerd[1680]: time="2025-12-16T12:16:59.348811022Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:59.349926 containerd[1680]: time="2025-12-16T12:16:59.349887987Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:16:59.350117 containerd[1680]: time="2025-12-16T12:16:59.349962508Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:59.350152 kubelet[2884]: E1216 12:16:59.350106 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:16:59.350187 kubelet[2884]: E1216 12:16:59.350151 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:16:59.350311 kubelet[2884]: E1216 12:16:59.350255 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:59.351716 kubelet[2884]: E1216 12:16:59.351680 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:17:00.650337 kubelet[2884]: E1216 12:17:00.650294 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:17:01.651758 kubelet[2884]: E1216 12:17:01.651642 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:17:02.650446 kubelet[2884]: E1216 12:17:02.650298 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:17:02.650909 kubelet[2884]: E1216 12:17:02.650335 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:17:03.650487 kubelet[2884]: E1216 12:17:03.650430 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:17:07.649556 kubelet[2884]: E1216 12:17:07.649448 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:17:12.650766 kubelet[2884]: E1216 12:17:12.650672 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:17:13.649672 kubelet[2884]: E1216 12:17:13.649270 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:17:13.650062 kubelet[2884]: E1216 12:17:13.650008 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:17:16.649985 kubelet[2884]: E1216 12:17:16.649841 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:17:16.649985 kubelet[2884]: E1216 12:17:16.649926 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:17:16.651189 kubelet[2884]: E1216 12:17:16.650588 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:17:22.649671 kubelet[2884]: E1216 12:17:22.649602 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:17:25.650390 kubelet[2884]: E1216 12:17:25.650335 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack 
image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:17:26.652512 kubelet[2884]: E1216 12:17:26.652470 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:17:27.649162 kubelet[2884]: E1216 12:17:27.649049 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:17:27.649433 containerd[1680]: time="2025-12-16T12:17:27.649247471Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:17:28.004951 containerd[1680]: 
time="2025-12-16T12:17:28.004761564Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:28.007678 containerd[1680]: time="2025-12-16T12:17:28.007049535Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:17:28.007791 containerd[1680]: time="2025-12-16T12:17:28.007190296Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:28.007936 kubelet[2884]: E1216 12:17:28.007902 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:17:28.008247 kubelet[2884]: E1216 12:17:28.007998 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:17:28.008247 kubelet[2884]: E1216 12:17:28.008186 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6kpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:28.009533 kubelet[2884]: E1216 12:17:28.009367 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:17:30.652794 kubelet[2884]: E1216 12:17:30.652734 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:17:31.650494 containerd[1680]: time="2025-12-16T12:17:31.650446236Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:17:32.021836 containerd[1680]: time="2025-12-16T12:17:32.021739610Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:32.024728 containerd[1680]: time="2025-12-16T12:17:32.024643904Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:17:32.024829 containerd[1680]: time="2025-12-16T12:17:32.024757385Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:32.025141 kubelet[2884]: E1216 12:17:32.025029 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:17:32.025767 kubelet[2884]: E1216 12:17:32.025129 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:17:32.025767 kubelet[2884]: E1216 12:17:32.025598 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bce81bb11dd34fd8b7a0b5197b60303d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:32.027659 containerd[1680]: time="2025-12-16T12:17:32.027629040Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:17:32.369039 containerd[1680]: 
time="2025-12-16T12:17:32.368643179Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:32.371227 containerd[1680]: time="2025-12-16T12:17:32.371178272Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:17:32.371437 containerd[1680]: time="2025-12-16T12:17:32.371251432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:32.371612 kubelet[2884]: E1216 12:17:32.371554 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:17:32.371612 kubelet[2884]: E1216 12:17:32.371607 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:17:32.371751 kubelet[2884]: E1216 12:17:32.371714 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:32.373054 kubelet[2884]: E1216 12:17:32.373000 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:17:34.649265 kubelet[2884]: E1216 12:17:34.649219 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:17:36.574281 systemd[1]: Started sshd@7-10.0.21.180:22-139.178.68.195:47512.service - OpenSSH per-connection server daemon (139.178.68.195:47512). Dec 16 12:17:36.573000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.180:22-139.178.68.195:47512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:36.578109 kernel: audit: type=1130 audit(1765887456.573:760): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.180:22-139.178.68.195:47512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:36.650379 kubelet[2884]: E1216 12:17:36.650310 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:17:37.420238 sshd[5291]: Accepted publickey for core from 139.178.68.195 port 47512 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:17:37.419000 audit[5291]: USER_ACCT pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.424145 kernel: audit: type=1101 audit(1765887457.419:761): pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.423000 audit[5291]: CRED_ACQ pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.425053 sshd-session[5291]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:17:37.429641 kernel: audit: type=1103 audit(1765887457.423:762): pid=5291 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.429708 kernel: audit: type=1006 audit(1765887457.423:763): pid=5291 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Dec 16 12:17:37.423000 audit[5291]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1dd8160 a2=3 a3=0 items=0 ppid=1 pid=5291 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.433512 kernel: audit: type=1300 audit(1765887457.423:763): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1dd8160 a2=3 a3=0 items=0 ppid=1 pid=5291 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:37.435330 kernel: audit: type=1327 audit(1765887457.423:763): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:37.423000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:37.436291 
systemd-logind[1646]: New session 9 of user core. Dec 16 12:17:37.443298 systemd[1]: Started session-9.scope - Session 9 of User core. Dec 16 12:17:37.446000 audit[5291]: USER_START pid=5291 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.449000 audit[5295]: CRED_ACQ pid=5295 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.453786 kernel: audit: type=1105 audit(1765887457.446:764): pid=5291 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.453916 kernel: audit: type=1103 audit(1765887457.449:765): pid=5295 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:37.997675 sshd[5295]: Connection closed by 139.178.68.195 port 47512 Dec 16 12:17:38.000277 sshd-session[5291]: pam_unix(sshd:session): session closed for user core Dec 16 12:17:38.001000 audit[5291]: USER_END pid=5291 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" 
hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:38.005549 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:17:38.001000 audit[5291]: CRED_DISP pid=5291 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:38.008559 systemd[1]: sshd@7-10.0.21.180:22-139.178.68.195:47512.service: Deactivated successfully. Dec 16 12:17:38.009660 kernel: audit: type=1106 audit(1765887458.001:766): pid=5291 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:38.009727 kernel: audit: type=1104 audit(1765887458.001:767): pid=5291 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:38.009000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.180:22-139.178.68.195:47512 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:38.012982 systemd-logind[1646]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:17:38.014639 systemd-logind[1646]: Removed session 9. 
Dec 16 12:17:38.649988 containerd[1680]: time="2025-12-16T12:17:38.649939973Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:17:38.982417 containerd[1680]: time="2025-12-16T12:17:38.982373028Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:38.985051 containerd[1680]: time="2025-12-16T12:17:38.985002081Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:17:38.985195 containerd[1680]: time="2025-12-16T12:17:38.985129522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:38.985379 kubelet[2884]: E1216 12:17:38.985338 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:17:38.985632 kubelet[2884]: E1216 12:17:38.985393 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:17:38.985632 kubelet[2884]: E1216 12:17:38.985518 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckh7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:38.987197 kubelet[2884]: E1216 12:17:38.987149 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:17:39.650490 containerd[1680]: time="2025-12-16T12:17:39.650430835Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:17:39.650984 kubelet[2884]: E1216 12:17:39.650951 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:17:40.058768 containerd[1680]: time="2025-12-16T12:17:40.058651437Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:40.060281 containerd[1680]: time="2025-12-16T12:17:40.060230525Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:17:40.060367 containerd[1680]: time="2025-12-16T12:17:40.060273165Z" level=info msg="stop 
pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:40.060518 kubelet[2884]: E1216 12:17:40.060468 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:17:40.060823 kubelet[2884]: E1216 12:17:40.060520 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:17:40.060823 kubelet[2884]: E1216 12:17:40.060651 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2lmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:40.061856 kubelet[2884]: E1216 12:17:40.061807 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:17:43.166000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.180:22-139.178.68.195:55266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:43.167273 systemd[1]: Started sshd@8-10.0.21.180:22-139.178.68.195:55266.service - OpenSSH per-connection server daemon (139.178.68.195:55266). Dec 16 12:17:43.170781 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:17:43.170845 kernel: audit: type=1130 audit(1765887463.166:769): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.180:22-139.178.68.195:55266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:43.649728 containerd[1680]: time="2025-12-16T12:17:43.649614870Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:17:43.979410 containerd[1680]: time="2025-12-16T12:17:43.979355832Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:43.980958 containerd[1680]: time="2025-12-16T12:17:43.980901360Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:17:43.981069 containerd[1680]: time="2025-12-16T12:17:43.980989640Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:43.981183 kubelet[2884]: E1216 12:17:43.981142 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:17:43.981566 kubelet[2884]: E1216 12:17:43.981194 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:17:43.981566 kubelet[2884]: E1216 12:17:43.981316 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjrmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:43.982489 kubelet[2884]: E1216 12:17:43.982443 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:17:44.027000 audit[5339]: USER_ACCT pid=5339 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.028336 sshd[5339]: Accepted publickey for core from 139.178.68.195 port 55266 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:17:44.030000 audit[5339]: CRED_ACQ pid=5339 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.032617 sshd-session[5339]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:17:44.034613 kernel: audit: type=1101 audit(1765887464.027:770): pid=5339 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.034678 kernel: audit: type=1103 audit(1765887464.030:771): pid=5339 uid=0 
auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.034700 kernel: audit: type=1006 audit(1765887464.030:772): pid=5339 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Dec 16 12:17:44.036270 kernel: audit: type=1300 audit(1765887464.030:772): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffca534eb0 a2=3 a3=0 items=0 ppid=1 pid=5339 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:44.030000 audit[5339]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffca534eb0 a2=3 a3=0 items=0 ppid=1 pid=5339 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:44.038623 systemd-logind[1646]: New session 10 of user core. Dec 16 12:17:44.030000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:44.040806 kernel: audit: type=1327 audit(1765887464.030:772): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:44.044386 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 16 12:17:44.045000 audit[5339]: USER_START pid=5339 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.047000 audit[5343]: CRED_ACQ pid=5343 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.053394 kernel: audit: type=1105 audit(1765887464.045:773): pid=5339 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.053502 kernel: audit: type=1103 audit(1765887464.047:774): pid=5343 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.582477 sshd[5343]: Connection closed by 139.178.68.195 port 55266 Dec 16 12:17:44.582909 sshd-session[5339]: pam_unix(sshd:session): session closed for user core Dec 16 12:17:44.583000 audit[5339]: USER_END pid=5339 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.587459 systemd[1]: 
sshd@8-10.0.21.180:22-139.178.68.195:55266.service: Deactivated successfully. Dec 16 12:17:44.583000 audit[5339]: CRED_DISP pid=5339 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.589611 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:17:44.590412 systemd-logind[1646]: Session 10 logged out. Waiting for processes to exit. Dec 16 12:17:44.590672 kernel: audit: type=1106 audit(1765887464.583:775): pid=5339 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.590716 kernel: audit: type=1104 audit(1765887464.583:776): pid=5339 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:44.586000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.180:22-139.178.68.195:55266 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:44.591750 systemd-logind[1646]: Removed session 10. Dec 16 12:17:44.776000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.180:22-139.178.68.195:55274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:44.777882 systemd[1]: Started sshd@9-10.0.21.180:22-139.178.68.195:55274.service - OpenSSH per-connection server daemon (139.178.68.195:55274). Dec 16 12:17:45.690000 audit[5358]: USER_ACCT pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:45.692471 sshd[5358]: Accepted publickey for core from 139.178.68.195 port 55274 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:17:45.692000 audit[5358]: CRED_ACQ pid=5358 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:45.692000 audit[5358]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff63df3f0 a2=3 a3=0 items=0 ppid=1 pid=5358 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:45.692000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:45.694341 sshd-session[5358]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:17:45.700305 systemd-logind[1646]: New session 11 of user core. Dec 16 12:17:45.706347 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 16 12:17:45.708000 audit[5358]: USER_START pid=5358 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:45.710000 audit[5362]: CRED_ACQ pid=5362 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:46.329207 sshd[5362]: Connection closed by 139.178.68.195 port 55274 Dec 16 12:17:46.329988 sshd-session[5358]: pam_unix(sshd:session): session closed for user core Dec 16 12:17:46.331000 audit[5358]: USER_END pid=5358 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:46.331000 audit[5358]: CRED_DISP pid=5358 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:46.335490 systemd-logind[1646]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:17:46.335741 systemd[1]: sshd@9-10.0.21.180:22-139.178.68.195:55274.service: Deactivated successfully. Dec 16 12:17:46.334000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.180:22-139.178.68.195:55274 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:46.337968 systemd[1]: session-11.scope: Deactivated successfully. Dec 16 12:17:46.340876 systemd-logind[1646]: Removed session 11. Dec 16 12:17:46.500000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.180:22-139.178.68.195:55284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:46.501250 systemd[1]: Started sshd@10-10.0.21.180:22-139.178.68.195:55284.service - OpenSSH per-connection server daemon (139.178.68.195:55284). Dec 16 12:17:47.348000 audit[5374]: USER_ACCT pid=5374 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:47.350239 sshd[5374]: Accepted publickey for core from 139.178.68.195 port 55284 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:17:47.349000 audit[5374]: CRED_ACQ pid=5374 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:47.349000 audit[5374]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe6146c00 a2=3 a3=0 items=0 ppid=1 pid=5374 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:47.349000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:47.351945 sshd-session[5374]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:17:47.356388 systemd-logind[1646]: New session 12 of user core. 
Dec 16 12:17:47.367253 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 16 12:17:47.369000 audit[5374]: USER_START pid=5374 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:47.370000 audit[5378]: CRED_ACQ pid=5378 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:47.650357 containerd[1680]: time="2025-12-16T12:17:47.650244793Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:17:47.652208 kubelet[2884]: E1216 12:17:47.651785 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:17:47.907738 sshd[5378]: Connection closed by 139.178.68.195 port 55284 Dec 16 12:17:47.908031 sshd-session[5374]: pam_unix(sshd:session): 
session closed for user core Dec 16 12:17:47.908000 audit[5374]: USER_END pid=5374 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:47.908000 audit[5374]: CRED_DISP pid=5374 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:47.912637 systemd-logind[1646]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:17:47.913456 systemd[1]: sshd@10-10.0.21.180:22-139.178.68.195:55284.service: Deactivated successfully. Dec 16 12:17:47.912000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.180:22-139.178.68.195:55284 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:47.915520 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:17:47.917467 systemd-logind[1646]: Removed session 12. 
Dec 16 12:17:47.971091 containerd[1680]: time="2025-12-16T12:17:47.971037909Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:47.972537 containerd[1680]: time="2025-12-16T12:17:47.972489076Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:17:47.972637 containerd[1680]: time="2025-12-16T12:17:47.972579877Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:47.972920 kubelet[2884]: E1216 12:17:47.972780 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:17:47.972970 kubelet[2884]: E1216 12:17:47.972927 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:17:47.973224 kubelet[2884]: E1216 12:17:47.973156 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:47.974361 kubelet[2884]: E1216 12:17:47.974317 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:17:49.650057 kubelet[2884]: E1216 12:17:49.650016 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:17:50.651349 kubelet[2884]: E1216 12:17:50.651296 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:17:50.652591 containerd[1680]: time="2025-12-16T12:17:50.651891061Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:17:50.991612 containerd[1680]: time="2025-12-16T12:17:50.991502833Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:50.993589 containerd[1680]: time="2025-12-16T12:17:50.993547563Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:17:50.993738 containerd[1680]: time="2025-12-16T12:17:50.993619404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:50.993806 kubelet[2884]: E1216 12:17:50.993763 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" 
Dec 16 12:17:50.993841 kubelet[2884]: E1216 12:17:50.993810 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:17:50.993961 kubelet[2884]: E1216 12:17:50.993925 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&Secco
mpProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:50.996772 containerd[1680]: time="2025-12-16T12:17:50.996750020Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:17:51.321747 containerd[1680]: time="2025-12-16T12:17:51.321627037Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:17:51.329125 containerd[1680]: time="2025-12-16T12:17:51.329053275Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:17:51.329295 containerd[1680]: time="2025-12-16T12:17:51.329113675Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:17:51.329667 kubelet[2884]: E1216 12:17:51.329435 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:17:51.329667 kubelet[2884]: E1216 12:17:51.329485 2884 kuberuntime_image.go:55] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:17:51.329667 kubelet[2884]: E1216 12:17:51.329610 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:Run
timeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:17:51.330815 kubelet[2884]: E1216 12:17:51.330758 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:17:51.649835 kubelet[2884]: E1216 12:17:51.649450 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:17:53.088000 audit[1]: 
SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.180:22-139.178.68.195:53420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:53.089291 systemd[1]: Started sshd@11-10.0.21.180:22-139.178.68.195:53420.service - OpenSSH per-connection server daemon (139.178.68.195:53420). Dec 16 12:17:53.092317 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 12:17:53.092395 kernel: audit: type=1130 audit(1765887473.088:796): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.180:22-139.178.68.195:53420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:53.975000 audit[5396]: USER_ACCT pid=5396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:53.977195 sshd[5396]: Accepted publickey for core from 139.178.68.195 port 53420 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:17:53.978000 audit[5396]: CRED_ACQ pid=5396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:53.980702 sshd-session[5396]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:17:53.982610 kernel: audit: type=1101 audit(1765887473.975:797): pid=5396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 
16 12:17:53.982743 kernel: audit: type=1103 audit(1765887473.978:798): pid=5396 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:53.982803 kernel: audit: type=1006 audit(1765887473.978:799): pid=5396 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Dec 16 12:17:53.978000 audit[5396]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff0b175a0 a2=3 a3=0 items=0 ppid=1 pid=5396 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:53.987182 kernel: audit: type=1300 audit(1765887473.978:799): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff0b175a0 a2=3 a3=0 items=0 ppid=1 pid=5396 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:17:53.987247 kernel: audit: type=1327 audit(1765887473.978:799): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:53.978000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:17:53.986573 systemd-logind[1646]: New session 13 of user core. Dec 16 12:17:53.993448 systemd[1]: Started session-13.scope - Session 13 of User core. 
Dec 16 12:17:53.994000 audit[5396]: USER_START pid=5396 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:53.996000 audit[5400]: CRED_ACQ pid=5400 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.001554 kernel: audit: type=1105 audit(1765887473.994:800): pid=5396 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.001616 kernel: audit: type=1103 audit(1765887473.996:801): pid=5400 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.546050 sshd[5400]: Connection closed by 139.178.68.195 port 53420 Dec 16 12:17:54.546477 sshd-session[5396]: pam_unix(sshd:session): session closed for user core Dec 16 12:17:54.546000 audit[5396]: USER_END pid=5396 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.550451 systemd[1]: 
sshd@11-10.0.21.180:22-139.178.68.195:53420.service: Deactivated successfully. Dec 16 12:17:54.546000 audit[5396]: CRED_DISP pid=5396 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.553221 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:17:54.553744 kernel: audit: type=1106 audit(1765887474.546:802): pid=5396 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.553785 kernel: audit: type=1104 audit(1765887474.546:803): pid=5396 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:17:54.550000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.180:22-139.178.68.195:53420 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:17:54.554376 systemd-logind[1646]: Session 13 logged out. Waiting for processes to exit. Dec 16 12:17:54.555666 systemd-logind[1646]: Removed session 13. 
Dec 16 12:17:54.649639 kubelet[2884]: E1216 12:17:54.649303 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:17:58.650633 kubelet[2884]: E1216 12:17:58.650586 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:17:59.717359 systemd[1]: Started sshd@12-10.0.21.180:22-139.178.68.195:53422.service - OpenSSH per-connection server daemon (139.178.68.195:53422). Dec 16 12:17:59.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.180:22-139.178.68.195:53422 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:17:59.718169 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:17:59.718226 kernel: audit: type=1130 audit(1765887479.716:805): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.180:22-139.178.68.195:53422 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:00.534000 audit[5416]: USER_ACCT pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.535508 sshd[5416]: Accepted publickey for core from 139.178.68.195 port 53422 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:00.536000 audit[5416]: CRED_ACQ pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.538422 sshd-session[5416]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:00.540952 kernel: audit: type=1101 audit(1765887480.534:806): pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.541100 kernel: audit: type=1103 audit(1765887480.536:807): pid=5416 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.541143 kernel: audit: type=1006 
audit(1765887480.536:808): pid=5416 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Dec 16 12:18:00.542592 kernel: audit: type=1300 audit(1765887480.536:808): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3683060 a2=3 a3=0 items=0 ppid=1 pid=5416 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:00.536000 audit[5416]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3683060 a2=3 a3=0 items=0 ppid=1 pid=5416 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:00.542678 systemd-logind[1646]: New session 14 of user core. Dec 16 12:18:00.536000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:00.546563 kernel: audit: type=1327 audit(1765887480.536:808): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:00.564474 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 16 12:18:00.565000 audit[5416]: USER_START pid=5416 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.569000 audit[5420]: CRED_ACQ pid=5420 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.573398 kernel: audit: type=1105 audit(1765887480.565:809): pid=5416 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:00.573528 kernel: audit: type=1103 audit(1765887480.569:810): pid=5420 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:01.068128 sshd[5420]: Connection closed by 139.178.68.195 port 53422 Dec 16 12:18:01.068394 sshd-session[5416]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:01.068000 audit[5416]: USER_END pid=5416 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:01.072679 systemd-logind[1646]: Session 14 logged 
out. Waiting for processes to exit. Dec 16 12:18:01.068000 audit[5416]: CRED_DISP pid=5416 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:01.073122 systemd[1]: sshd@12-10.0.21.180:22-139.178.68.195:53422.service: Deactivated successfully. Dec 16 12:18:01.074911 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 12:18:01.075565 kernel: audit: type=1106 audit(1765887481.068:811): pid=5416 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:01.075614 kernel: audit: type=1104 audit(1765887481.068:812): pid=5416 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:01.072000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.180:22-139.178.68.195:53422 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:01.078550 systemd-logind[1646]: Removed session 14. 
Dec 16 12:18:03.649813 kubelet[2884]: E1216 12:18:03.649724 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:18:03.649813 kubelet[2884]: E1216 12:18:03.649746 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:18:03.649813 kubelet[2884]: E1216 12:18:03.649726 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:18:05.649740 kubelet[2884]: E1216 12:18:05.649584 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:18:06.239919 systemd[1]: Started sshd@13-10.0.21.180:22-139.178.68.195:47810.service - OpenSSH per-connection server daemon (139.178.68.195:47810). Dec 16 12:18:06.239000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.180:22-139.178.68.195:47810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:06.243350 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:18:06.243427 kernel: audit: type=1130 audit(1765887486.239:814): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.180:22-139.178.68.195:47810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:18:06.651817 kubelet[2884]: E1216 12:18:06.651633 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:18:07.068000 audit[5461]: USER_ACCT pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.069918 sshd[5461]: Accepted publickey for core from 139.178.68.195 port 47810 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:07.073101 kernel: audit: type=1101 audit(1765887487.068:815): pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.072000 audit[5461]: CRED_ACQ pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.074397 sshd-session[5461]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:07.077561 kernel: audit: type=1103 audit(1765887487.072:816): pid=5461 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.077623 kernel: audit: type=1006 audit(1765887487.072:817): pid=5461 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Dec 16 12:18:07.077643 kernel: audit: type=1300 audit(1765887487.072:817): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc18a2cf0 a2=3 a3=0 items=0 ppid=1 pid=5461 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:07.072000 audit[5461]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc18a2cf0 a2=3 a3=0 items=0 ppid=1 pid=5461 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:07.072000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:07.081466 kernel: audit: type=1327 audit(1765887487.072:817): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:07.082244 systemd-logind[1646]: New session 15 of user core. Dec 16 12:18:07.091440 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 12:18:07.093000 audit[5461]: USER_START pid=5461 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.096000 audit[5465]: CRED_ACQ pid=5465 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.100764 kernel: audit: type=1105 audit(1765887487.093:818): pid=5461 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.100819 kernel: audit: type=1103 audit(1765887487.096:819): pid=5465 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.617307 sshd[5465]: Connection closed by 139.178.68.195 port 47810 Dec 16 12:18:07.617602 sshd-session[5461]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:07.618000 audit[5461]: USER_END pid=5461 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.622840 systemd[1]: 
sshd@13-10.0.21.180:22-139.178.68.195:47810.service: Deactivated successfully. Dec 16 12:18:07.626566 kernel: audit: type=1106 audit(1765887487.618:820): pid=5461 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.626600 kernel: audit: type=1104 audit(1765887487.618:821): pid=5461 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.618000 audit[5461]: CRED_DISP pid=5461 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:07.625935 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:18:07.619000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.180:22-139.178.68.195:47810 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:07.627828 systemd-logind[1646]: Session 15 logged out. Waiting for processes to exit. Dec 16 12:18:07.632448 systemd-logind[1646]: Removed session 15. Dec 16 12:18:07.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.180:22-139.178.68.195:47814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:18:07.783245 systemd[1]: Started sshd@14-10.0.21.180:22-139.178.68.195:47814.service - OpenSSH per-connection server daemon (139.178.68.195:47814). Dec 16 12:18:08.610000 audit[5478]: USER_ACCT pid=5478 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:08.611445 sshd[5478]: Accepted publickey for core from 139.178.68.195 port 47814 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:08.611000 audit[5478]: CRED_ACQ pid=5478 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:08.611000 audit[5478]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff2990e00 a2=3 a3=0 items=0 ppid=1 pid=5478 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:08.611000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:08.613153 sshd-session[5478]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:08.617135 systemd-logind[1646]: New session 16 of user core. Dec 16 12:18:08.629465 systemd[1]: Started session-16.scope - Session 16 of User core. 
Dec 16 12:18:08.630000 audit[5478]: USER_START pid=5478 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:08.632000 audit[5482]: CRED_ACQ pid=5482 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:09.227642 sshd[5482]: Connection closed by 139.178.68.195 port 47814 Dec 16 12:18:09.228274 sshd-session[5478]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:09.228000 audit[5478]: USER_END pid=5478 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:09.228000 audit[5478]: CRED_DISP pid=5478 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:09.232648 systemd-logind[1646]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:18:09.232839 systemd[1]: sshd@14-10.0.21.180:22-139.178.68.195:47814.service: Deactivated successfully. Dec 16 12:18:09.232000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.180:22-139.178.68.195:47814 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:18:09.234737 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:18:09.236437 systemd-logind[1646]: Removed session 16. Dec 16 12:18:09.400517 systemd[1]: Started sshd@15-10.0.21.180:22-139.178.68.195:47820.service - OpenSSH per-connection server daemon (139.178.68.195:47820). Dec 16 12:18:09.399000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.180:22-139.178.68.195:47820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:09.650587 kubelet[2884]: E1216 12:18:09.650382 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:18:10.223000 audit[5494]: USER_ACCT pid=5494 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:10.225235 sshd[5494]: Accepted publickey for core from 139.178.68.195 port 47820 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:10.225000 audit[5494]: CRED_ACQ pid=5494 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:10.225000 audit[5494]: SYSCALL arch=c00000b7 
syscall=64 success=yes exit=3 a0=8 a1=ffffcea52490 a2=3 a3=0 items=0 ppid=1 pid=5494 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:10.225000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:10.227214 sshd-session[5494]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:18:10.231530 systemd-logind[1646]: New session 17 of user core.
Dec 16 12:18:10.242348 systemd[1]: Started session-17.scope - Session 17 of User core.
Dec 16 12:18:10.244000 audit[5494]: USER_START pid=5494 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:10.246000 audit[5498]: CRED_ACQ pid=5498 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:10.651860 kubelet[2884]: E1216 12:18:10.651719 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c"
Dec 16 12:18:11.304000 audit[5510]: NETFILTER_CFG table=filter:149 family=2 entries=26 op=nft_register_rule pid=5510 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.306662 kernel: kauditd_printk_skb: 20 callbacks suppressed
Dec 16 12:18:11.306845 kernel: audit: type=1325 audit(1765887491.304:838): table=filter:149 family=2 entries=26 op=nft_register_rule pid=5510 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.304000 audit[5510]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc750d2d0 a2=0 a3=1 items=0 ppid=2997 pid=5510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.313912 kernel: audit: type=1300 audit(1765887491.304:838): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc750d2d0 a2=0 a3=1 items=0 ppid=2997 pid=5510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.317432 kernel: audit: type=1327 audit(1765887491.304:838): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.304000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.315000 audit[5510]: NETFILTER_CFG table=nat:150 family=2 entries=20 op=nft_register_rule pid=5510 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.319527 kernel: audit: type=1325 audit(1765887491.315:839): table=nat:150 family=2 entries=20 op=nft_register_rule pid=5510 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.319633 kernel: audit: type=1300 audit(1765887491.315:839): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc750d2d0 a2=0 a3=1 items=0 ppid=2997 pid=5510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.315000 audit[5510]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc750d2d0 a2=0 a3=1 items=0 ppid=2997 pid=5510 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.315000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.325116 kernel: audit: type=1327 audit(1765887491.315:839): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.336000 audit[5512]: NETFILTER_CFG table=filter:151 family=2 entries=38 op=nft_register_rule pid=5512 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.336000 audit[5512]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffe0570b60 a2=0 a3=1 items=0 ppid=2997 pid=5512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.343443 kernel: audit: type=1325 audit(1765887491.336:840): table=filter:151 family=2 entries=38 op=nft_register_rule pid=5512 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.343511 kernel: audit: type=1300 audit(1765887491.336:840): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffe0570b60 a2=0 a3=1 items=0 ppid=2997 pid=5512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.336000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.346416 kernel: audit: type=1327 audit(1765887491.336:840): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.347000 audit[5512]: NETFILTER_CFG table=nat:152 family=2 entries=20 op=nft_register_rule pid=5512 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.347000 audit[5512]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe0570b60 a2=0 a3=1 items=0 ppid=2997 pid=5512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:11.347000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:11.352097 kernel: audit: type=1325 audit(1765887491.347:841): table=nat:152 family=2 entries=20 op=nft_register_rule pid=5512 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:11.483121 sshd[5498]: Connection closed by 139.178.68.195 port 47820
Dec 16 12:18:11.483047 sshd-session[5494]: pam_unix(sshd:session): session closed for user core
Dec 16 12:18:11.483000 audit[5494]: USER_END pid=5494 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:11.483000 audit[5494]: CRED_DISP pid=5494 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:11.487342 systemd-logind[1646]: Session 17 logged out. Waiting for processes to exit.
Dec 16 12:18:11.487518 systemd[1]: sshd@15-10.0.21.180:22-139.178.68.195:47820.service: Deactivated successfully.
Dec 16 12:18:11.486000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.180:22-139.178.68.195:47820 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:11.489533 systemd[1]: session-17.scope: Deactivated successfully.
Dec 16 12:18:11.491352 systemd-logind[1646]: Removed session 17.
Dec 16 12:18:11.656696 systemd[1]: Started sshd@16-10.0.21.180:22-139.178.68.195:55314.service - OpenSSH per-connection server daemon (139.178.68.195:55314).
Dec 16 12:18:11.655000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.180:22-139.178.68.195:55314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:12.481000 audit[5517]: USER_ACCT pid=5517 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:12.482485 sshd[5517]: Accepted publickey for core from 139.178.68.195 port 55314 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:18:12.483000 audit[5517]: CRED_ACQ pid=5517 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:12.483000 audit[5517]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc3e9b440 a2=3 a3=0 items=0 ppid=1 pid=5517 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:12.483000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:12.485320 sshd-session[5517]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:18:12.490226 systemd-logind[1646]: New session 18 of user core.
Dec 16 12:18:12.498413 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 16 12:18:12.500000 audit[5517]: USER_START pid=5517 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:12.501000 audit[5521]: CRED_ACQ pid=5521 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:13.155463 sshd[5521]: Connection closed by 139.178.68.195 port 55314
Dec 16 12:18:13.155769 sshd-session[5517]: pam_unix(sshd:session): session closed for user core
Dec 16 12:18:13.156000 audit[5517]: USER_END pid=5517 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:13.156000 audit[5517]: CRED_DISP pid=5517 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:13.160994 systemd[1]: sshd@16-10.0.21.180:22-139.178.68.195:55314.service: Deactivated successfully.
Dec 16 12:18:13.161000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.180:22-139.178.68.195:55314 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:13.164285 systemd[1]: session-18.scope: Deactivated successfully.
Dec 16 12:18:13.166812 systemd-logind[1646]: Session 18 logged out. Waiting for processes to exit.
Dec 16 12:18:13.170465 systemd-logind[1646]: Removed session 18.
Dec 16 12:18:13.321000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.180:22-139.178.68.195:55326 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:13.322765 systemd[1]: Started sshd@17-10.0.21.180:22-139.178.68.195:55326.service - OpenSSH per-connection server daemon (139.178.68.195:55326).
Dec 16 12:18:14.146000 audit[5533]: USER_ACCT pid=5533 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:14.147715 sshd[5533]: Accepted publickey for core from 139.178.68.195 port 55326 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:18:14.147000 audit[5533]: CRED_ACQ pid=5533 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:14.147000 audit[5533]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc1229580 a2=3 a3=0 items=0 ppid=1 pid=5533 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:14.147000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:14.149508 sshd-session[5533]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:18:14.154278 systemd-logind[1646]: New session 19 of user core.
Dec 16 12:18:14.162267 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 16 12:18:14.164000 audit[5533]: USER_START pid=5533 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:14.166000 audit[5537]: CRED_ACQ pid=5537 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:14.705208 sshd[5537]: Connection closed by 139.178.68.195 port 55326
Dec 16 12:18:14.705973 sshd-session[5533]: pam_unix(sshd:session): session closed for user core
Dec 16 12:18:14.706000 audit[5533]: USER_END pid=5533 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:14.706000 audit[5533]: CRED_DISP pid=5533 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:14.710984 systemd[1]: sshd@17-10.0.21.180:22-139.178.68.195:55326.service: Deactivated successfully.
Dec 16 12:18:14.710000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.180:22-139.178.68.195:55326 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:14.712977 systemd[1]: session-19.scope: Deactivated successfully.
Dec 16 12:18:14.713843 systemd-logind[1646]: Session 19 logged out. Waiting for processes to exit.
Dec 16 12:18:14.714895 systemd-logind[1646]: Removed session 19.
Dec 16 12:18:15.650487 kubelet[2884]: E1216 12:18:15.650289 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4"
Dec 16 12:18:15.803000 audit[5551]: NETFILTER_CFG table=filter:153 family=2 entries=26 op=nft_register_rule pid=5551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:15.803000 audit[5551]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd5a493f0 a2=0 a3=1 items=0 ppid=2997 pid=5551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:15.803000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:15.808000 audit[5551]: NETFILTER_CFG table=nat:154 family=2 entries=104 op=nft_register_chain pid=5551 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:18:15.808000 audit[5551]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffd5a493f0 a2=0 a3=1 items=0 ppid=2997 pid=5551 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:15.808000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:18:16.650260 kubelet[2884]: E1216 12:18:16.650057 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146"
Dec 16 12:18:17.651271 kubelet[2884]: E1216 12:18:17.651207 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e"
Dec 16 12:18:20.650150 kubelet[2884]: E1216 12:18:20.650057 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8"
Dec 16 12:18:20.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.180:22-139.178.68.195:55342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:20.916041 systemd[1]: Started sshd@18-10.0.21.180:22-139.178.68.195:55342.service - OpenSSH per-connection server daemon (139.178.68.195:55342).
Dec 16 12:18:20.919905 kernel: kauditd_printk_skb: 33 callbacks suppressed
Dec 16 12:18:20.920013 kernel: audit: type=1130 audit(1765887500.915:865): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.180:22-139.178.68.195:55342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:21.650425 kubelet[2884]: E1216 12:18:21.650382 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf"
Dec 16 12:18:21.743000 audit[5555]: USER_ACCT pid=5555 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.745316 sshd[5555]: Accepted publickey for core from 139.178.68.195 port 55342 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:18:21.746000 audit[5555]: CRED_ACQ pid=5555 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.748611 sshd-session[5555]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:18:21.750338 kernel: audit: type=1101 audit(1765887501.743:866): pid=5555 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.750406 kernel: audit: type=1103 audit(1765887501.746:867): pid=5555 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.752117 kernel: audit: type=1006 audit(1765887501.746:868): pid=5555 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=20 res=1
Dec 16 12:18:21.752179 kernel: audit: type=1300 audit(1765887501.746:868): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc01217d0 a2=3 a3=0 items=0 ppid=1 pid=5555 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:21.746000 audit[5555]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc01217d0 a2=3 a3=0 items=0 ppid=1 pid=5555 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:21.754497 systemd-logind[1646]: New session 20 of user core.
Dec 16 12:18:21.755129 kernel: audit: type=1327 audit(1765887501.746:868): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:21.746000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:21.764258 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 16 12:18:21.765000 audit[5555]: USER_START pid=5555 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.769000 audit[5559]: CRED_ACQ pid=5559 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.773587 kernel: audit: type=1105 audit(1765887501.765:869): pid=5555 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:21.773646 kernel: audit: type=1103 audit(1765887501.769:870): pid=5559 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:22.279303 sshd[5559]: Connection closed by 139.178.68.195 port 55342
Dec 16 12:18:22.279754 sshd-session[5555]: pam_unix(sshd:session): session closed for user core
Dec 16 12:18:22.280000 audit[5555]: USER_END pid=5555 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:22.284351 systemd[1]: sshd@18-10.0.21.180:22-139.178.68.195:55342.service: Deactivated successfully.
Dec 16 12:18:22.280000 audit[5555]: CRED_DISP pid=5555 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:22.286215 systemd[1]: session-20.scope: Deactivated successfully.
Dec 16 12:18:22.286948 systemd-logind[1646]: Session 20 logged out. Waiting for processes to exit.
Dec 16 12:18:22.287668 kernel: audit: type=1106 audit(1765887502.280:871): pid=5555 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:22.287721 kernel: audit: type=1104 audit(1765887502.280:872): pid=5555 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:22.280000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.180:22-139.178.68.195:55342 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:22.288378 systemd-logind[1646]: Removed session 20.
Dec 16 12:18:24.652096 kubelet[2884]: E1216 12:18:24.650120 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71"
Dec 16 12:18:25.650829 kubelet[2884]: E1216 12:18:25.650732 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c"
Dec 16 12:18:27.455798 systemd[1]: Started sshd@19-10.0.21.180:22-139.178.68.195:51758.service - OpenSSH per-connection server daemon (139.178.68.195:51758).
Dec 16 12:18:27.455000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.180:22-139.178.68.195:51758 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:27.459219 kernel: kauditd_printk_skb: 1 callbacks suppressed
Dec 16 12:18:27.459284 kernel: audit: type=1130 audit(1765887507.455:874): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.180:22-139.178.68.195:51758 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:18:27.650325 kubelet[2884]: E1216 12:18:27.649049 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4"
Dec 16 12:18:27.650325 kubelet[2884]: E1216 12:18:27.649103 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146"
Dec 16 12:18:28.292000 audit[5575]: USER_ACCT pid=5575 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.294233 sshd[5575]: Accepted publickey for core from 139.178.68.195 port 51758 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:18:28.296000 audit[5575]: CRED_ACQ pid=5575 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.298104 kernel: audit: type=1101 audit(1765887508.292:875): pid=5575 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.298145 kernel: audit: type=1103 audit(1765887508.296:876): pid=5575 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.298534 sshd-session[5575]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:18:28.302165 kernel: audit: type=1006 audit(1765887508.296:877): pid=5575 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1
Dec 16 12:18:28.296000 audit[5575]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc2dddbe0 a2=3 a3=0 items=0 ppid=1 pid=5575 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:28.305487 kernel: audit: type=1300 audit(1765887508.296:877): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc2dddbe0 a2=3 a3=0 items=0 ppid=1 pid=5575 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:18:28.296000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:28.306869 kernel: audit: type=1327 audit(1765887508.296:877): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:18:28.309259 systemd-logind[1646]: New session 21 of user core.
Dec 16 12:18:28.320367 systemd[1]: Started session-21.scope - Session 21 of User core.
Dec 16 12:18:28.321000 audit[5575]: USER_START pid=5575 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.322000 audit[5579]: CRED_ACQ pid=5579 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.328296 kernel: audit: type=1105 audit(1765887508.321:878): pid=5575 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.328375 kernel: audit: type=1103 audit(1765887508.322:879): pid=5579 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:18:28.834102 sshd[5579]: Connection closed by 139.178.68.195 port 51758
Dec 16 12:18:28.834836 sshd-session[5575]: pam_unix(sshd:session): session closed for user core
Dec 16 12:18:28.835000 audit[5575]: USER_END pid=5575 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:28.838782 systemd[1]: sshd@19-10.0.21.180:22-139.178.68.195:51758.service: Deactivated successfully. Dec 16 12:18:28.835000 audit[5575]: CRED_DISP pid=5575 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:28.840574 systemd[1]: session-21.scope: Deactivated successfully. Dec 16 12:18:28.842180 kernel: audit: type=1106 audit(1765887508.835:880): pid=5575 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:28.842250 kernel: audit: type=1104 audit(1765887508.835:881): pid=5575 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:28.838000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.180:22-139.178.68.195:51758 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:28.842188 systemd-logind[1646]: Session 21 logged out. Waiting for processes to exit. 
Dec 16 12:18:28.843473 systemd-logind[1646]: Removed session 21. Dec 16 12:18:31.649951 kubelet[2884]: E1216 12:18:31.649914 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:18:31.650365 kubelet[2884]: E1216 12:18:31.650311 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:18:33.996000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.180:22-139.178.68.195:53406 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:33.998325 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:18:33.998383 kernel: audit: type=1130 audit(1765887513.996:883): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.180:22-139.178.68.195:53406 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:18:33.997473 systemd[1]: Started sshd@20-10.0.21.180:22-139.178.68.195:53406.service - OpenSSH per-connection server daemon (139.178.68.195:53406). Dec 16 12:18:34.651149 kubelet[2884]: E1216 12:18:34.651060 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:18:34.831000 audit[5592]: USER_ACCT pid=5592 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.833326 sshd[5592]: Accepted publickey for core from 139.178.68.195 port 53406 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:34.836179 kernel: audit: type=1101 audit(1765887514.831:884): pid=5592 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.835000 audit[5592]: CRED_ACQ 
pid=5592 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.837488 sshd-session[5592]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:34.840559 kernel: audit: type=1103 audit(1765887514.835:885): pid=5592 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.840631 kernel: audit: type=1006 audit(1765887514.835:886): pid=5592 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Dec 16 12:18:34.840663 kernel: audit: type=1300 audit(1765887514.835:886): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd61acd80 a2=3 a3=0 items=0 ppid=1 pid=5592 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:34.835000 audit[5592]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd61acd80 a2=3 a3=0 items=0 ppid=1 pid=5592 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:34.835000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:34.844643 kernel: audit: type=1327 audit(1765887514.835:886): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:34.849490 systemd-logind[1646]: New session 22 of user core. Dec 16 12:18:34.861538 systemd[1]: Started session-22.scope - Session 22 of User core. 
Dec 16 12:18:34.862000 audit[5592]: USER_START pid=5592 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.864000 audit[5622]: CRED_ACQ pid=5622 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.870089 kernel: audit: type=1105 audit(1765887514.862:887): pid=5592 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:34.870183 kernel: audit: type=1103 audit(1765887514.864:888): pid=5622 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:35.394890 sshd[5622]: Connection closed by 139.178.68.195 port 53406 Dec 16 12:18:35.395216 sshd-session[5592]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:35.397000 audit[5592]: USER_END pid=5592 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:35.401191 systemd[1]: 
sshd@20-10.0.21.180:22-139.178.68.195:53406.service: Deactivated successfully. Dec 16 12:18:35.397000 audit[5592]: CRED_DISP pid=5592 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:35.404241 kernel: audit: type=1106 audit(1765887515.397:889): pid=5592 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:35.404307 kernel: audit: type=1104 audit(1765887515.397:890): pid=5592 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:35.400000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.180:22-139.178.68.195:53406 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:35.404672 systemd[1]: session-22.scope: Deactivated successfully. Dec 16 12:18:35.407314 systemd-logind[1646]: Session 22 logged out. Waiting for processes to exit. Dec 16 12:18:35.408036 systemd-logind[1646]: Removed session 22. 
Dec 16 12:18:37.650181 kubelet[2884]: E1216 12:18:37.650139 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:18:38.650897 kubelet[2884]: E1216 12:18:38.650562 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:18:39.650040 kubelet[2884]: E1216 12:18:39.649980 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:18:40.570528 systemd[1]: Started sshd@21-10.0.21.180:22-139.178.68.195:44000.service - OpenSSH per-connection server daemon (139.178.68.195:44000). Dec 16 12:18:40.569000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.180:22-139.178.68.195:44000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:40.571252 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:18:40.571297 kernel: audit: type=1130 audit(1765887520.569:892): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.180:22-139.178.68.195:44000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:41.400000 audit[5635]: USER_ACCT pid=5635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.402240 sshd[5635]: Accepted publickey for core from 139.178.68.195 port 44000 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:41.404582 sshd-session[5635]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:41.402000 audit[5635]: CRED_ACQ pid=5635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.407673 kernel: audit: type=1101 audit(1765887521.400:893): pid=5635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.407737 kernel: audit: type=1103 audit(1765887521.402:894): pid=5635 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.407756 kernel: audit: type=1006 audit(1765887521.402:895): pid=5635 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Dec 16 12:18:41.408722 systemd-logind[1646]: New session 23 of user core. Dec 16 12:18:41.402000 audit[5635]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde5ef4b0 a2=3 a3=0 items=0 ppid=1 pid=5635 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:41.412217 kernel: audit: type=1300 audit(1765887521.402:895): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffde5ef4b0 a2=3 a3=0 items=0 ppid=1 pid=5635 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:41.412292 kernel: audit: type=1327 audit(1765887521.402:895): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:41.402000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:41.415268 systemd[1]: Started session-23.scope - Session 23 of User core. 
Dec 16 12:18:41.416000 audit[5635]: USER_START pid=5635 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.418000 audit[5639]: CRED_ACQ pid=5639 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.423663 kernel: audit: type=1105 audit(1765887521.416:896): pid=5635 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.423709 kernel: audit: type=1103 audit(1765887521.418:897): pid=5639 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.649506 kubelet[2884]: E1216 12:18:41.649422 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 
16 12:18:41.957251 sshd[5639]: Connection closed by 139.178.68.195 port 44000 Dec 16 12:18:41.957575 sshd-session[5635]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:41.958000 audit[5635]: USER_END pid=5635 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.962410 systemd[1]: sshd@21-10.0.21.180:22-139.178.68.195:44000.service: Deactivated successfully. Dec 16 12:18:41.958000 audit[5635]: CRED_DISP pid=5635 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.964256 systemd[1]: session-23.scope: Deactivated successfully. Dec 16 12:18:41.966231 kernel: audit: type=1106 audit(1765887521.958:898): pid=5635 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.966310 kernel: audit: type=1104 audit(1765887521.958:899): pid=5635 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:41.965840 systemd-logind[1646]: Session 23 logged out. Waiting for processes to exit. 
Dec 16 12:18:41.961000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.180:22-139.178.68.195:44000 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:41.967143 systemd-logind[1646]: Removed session 23. Dec 16 12:18:45.650128 kubelet[2884]: E1216 12:18:45.650063 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:18:46.650097 kubelet[2884]: E1216 12:18:46.649871 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:18:47.125000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.180:22-139.178.68.195:44006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:47.126907 systemd[1]: Started sshd@22-10.0.21.180:22-139.178.68.195:44006.service - OpenSSH per-connection server daemon (139.178.68.195:44006). 
Dec 16 12:18:47.127761 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:18:47.127801 kernel: audit: type=1130 audit(1765887527.125:901): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.180:22-139.178.68.195:44006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:47.956000 audit[5659]: USER_ACCT pid=5659 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.957698 sshd[5659]: Accepted publickey for core from 139.178.68.195 port 44006 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:18:47.960000 audit[5659]: CRED_ACQ pid=5659 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.962485 sshd-session[5659]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:18:47.965117 kernel: audit: type=1101 audit(1765887527.956:902): pid=5659 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.965178 kernel: audit: type=1103 audit(1765887527.960:903): pid=5659 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.965230 kernel: audit: type=1006 audit(1765887527.960:904): 
pid=5659 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Dec 16 12:18:47.960000 audit[5659]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc36839a0 a2=3 a3=0 items=0 ppid=1 pid=5659 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:47.970303 kernel: audit: type=1300 audit(1765887527.960:904): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc36839a0 a2=3 a3=0 items=0 ppid=1 pid=5659 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:18:47.960000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:47.971758 kernel: audit: type=1327 audit(1765887527.960:904): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:18:47.973606 systemd-logind[1646]: New session 24 of user core. Dec 16 12:18:47.980276 systemd[1]: Started session-24.scope - Session 24 of User core. 
Dec 16 12:18:47.983000 audit[5659]: USER_START pid=5659 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.987000 audit[5663]: CRED_ACQ pid=5663 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.991570 kernel: audit: type=1105 audit(1765887527.983:905): pid=5659 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:47.991663 kernel: audit: type=1103 audit(1765887527.987:906): pid=5663 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:48.493100 sshd[5663]: Connection closed by 139.178.68.195 port 44006 Dec 16 12:18:48.494257 sshd-session[5659]: pam_unix(sshd:session): session closed for user core Dec 16 12:18:48.495000 audit[5659]: USER_END pid=5659 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:48.495000 audit[5659]: CRED_DISP pid=5659 uid=0 
auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:48.502877 kernel: audit: type=1106 audit(1765887528.495:907): pid=5659 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:48.503042 kernel: audit: type=1104 audit(1765887528.495:908): pid=5659 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:18:48.503228 systemd[1]: sshd@22-10.0.21.180:22-139.178.68.195:44006.service: Deactivated successfully. Dec 16 12:18:48.502000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.180:22-139.178.68.195:44006 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:18:48.505148 systemd[1]: session-24.scope: Deactivated successfully. Dec 16 12:18:48.507302 systemd-logind[1646]: Session 24 logged out. Waiting for processes to exit. Dec 16 12:18:48.508809 systemd-logind[1646]: Removed session 24. 
Dec 16 12:18:49.649673 kubelet[2884]: E1216 12:18:49.649628 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:18:51.649294 kubelet[2884]: E1216 12:18:51.649232 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:18:52.653012 kubelet[2884]: E1216 12:18:52.652697 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:18:52.653647 containerd[1680]: time="2025-12-16T12:18:52.652992619Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:18:53.038147 containerd[1680]: time="2025-12-16T12:18:53.037919422Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:53.039501 containerd[1680]: time="2025-12-16T12:18:53.039446550Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:18:53.039601 containerd[1680]: time="2025-12-16T12:18:53.039546150Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:53.039962 kubelet[2884]: E1216 12:18:53.039744 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:53.039962 kubelet[2884]: E1216 12:18:53.039813 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:18:53.039962 kubelet[2884]: E1216 12:18:53.039923 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:bce81bb11dd34fd8b7a0b5197b60303d,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:53.042392 containerd[1680]: time="2025-12-16T12:18:53.042356405Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:18:53.375934 containerd[1680]: 
time="2025-12-16T12:18:53.375829825Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:53.377147 containerd[1680]: time="2025-12-16T12:18:53.377096192Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:18:53.377234 containerd[1680]: time="2025-12-16T12:18:53.377188272Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:53.377395 kubelet[2884]: E1216 12:18:53.377361 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:53.378180 kubelet[2884]: E1216 12:18:53.378115 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:18:53.378464 kubelet[2884]: E1216 12:18:53.378290 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2s2rj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-69fc96cf55-8lskp_calico-system(9883af1d-f9ff-4212-ac38-34ecc575631c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:53.379464 kubelet[2884]: E1216 12:18:53.379430 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:18:54.650331 kubelet[2884]: E1216 12:18:54.650270 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:18:56.652866 containerd[1680]: time="2025-12-16T12:18:56.652811137Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:18:57.003846 containerd[1680]: time="2025-12-16T12:18:57.003720567Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:18:57.005352 containerd[1680]: time="2025-12-16T12:18:57.005308255Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" 
error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:18:57.005499 containerd[1680]: time="2025-12-16T12:18:57.005385576Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:18:57.005584 kubelet[2884]: E1216 12:18:57.005539 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:57.005857 kubelet[2884]: E1216 12:18:57.005595 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:18:57.005857 kubelet[2884]: E1216 12:18:57.005722 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-p6kpv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-v5bnx_calico-system(896c3574-3482-4970-a592-5c7752aa620e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:18:57.007160 kubelet[2884]: E1216 12:18:57.007117 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:19:00.650089 containerd[1680]: time="2025-12-16T12:19:00.649802682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:00.991299 containerd[1680]: time="2025-12-16T12:19:00.991231143Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:00.993614 containerd[1680]: 
time="2025-12-16T12:19:00.993562995Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:00.993720 containerd[1680]: time="2025-12-16T12:19:00.993610235Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:00.993836 kubelet[2884]: E1216 12:19:00.993792 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:00.994110 kubelet[2884]: E1216 12:19:00.993847 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:00.994110 kubelet[2884]: E1216 12:19:00.993970 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-l2lmg,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-sbbxz_calico-apiserver(13366e02-1117-46c3-a880-8d8cc6c423f8): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:00.995416 kubelet[2884]: E1216 12:19:00.995131 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:19:01.650186 kubelet[2884]: E1216 12:19:01.650092 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:19:04.650607 kubelet[2884]: E1216 12:19:04.650505 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:19:06.652346 kubelet[2884]: E1216 12:19:06.652283 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:19:06.653054 containerd[1680]: time="2025-12-16T12:19:06.652924217Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:06.984345 containerd[1680]: time="2025-12-16T12:19:06.984234106Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:06.985922 containerd[1680]: time="2025-12-16T12:19:06.985879835Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:06.985991 containerd[1680]: time="2025-12-16T12:19:06.985961035Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:06.986170 kubelet[2884]: E1216 12:19:06.986139 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:06.986224 kubelet[2884]: E1216 12:19:06.986184 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:06.986478 kubelet[2884]: E1216 12:19:06.986396 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-wjrmj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-6784c79f67-5nkhx_calico-apiserver(3cfa8afe-d370-4d42-b9ea-f53cfd764b71): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:06.986593 containerd[1680]: time="2025-12-16T12:19:06.986459678Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:19:06.988062 kubelet[2884]: E1216 12:19:06.988030 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:19:07.340007 containerd[1680]: time="2025-12-16T12:19:07.339716039Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:07.342358 containerd[1680]: time="2025-12-16T12:19:07.342316493Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:19:07.342565 containerd[1680]: time="2025-12-16T12:19:07.342349933Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:07.342722 kubelet[2884]: E1216 12:19:07.342683 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:07.343025 kubelet[2884]: E1216 12:19:07.342834 2884 kuberuntime_image.go:55] "Failed to pull image" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:19:07.343025 kubelet[2884]: E1216 12:19:07.342970 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ckh7w,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-9bb48c66-mjzv6_calico-apiserver(576d8526-5af6-453c-afc4-7ebd613c4146): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:07.344166 kubelet[2884]: E1216 12:19:07.344130 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:19:08.652060 kubelet[2884]: E1216 12:19:08.652019 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed 
to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:19:13.350714 systemd[1]: cri-containerd-8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a.scope: Deactivated successfully. Dec 16 12:19:13.351244 systemd[1]: cri-containerd-8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a.scope: Consumed 4.283s CPU time, 61.2M memory peak. Dec 16 12:19:13.350000 audit: BPF prog-id=261 op=LOAD Dec 16 12:19:13.352473 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:19:13.352530 kernel: audit: type=1334 audit(1765887553.350:910): prog-id=261 op=LOAD Dec 16 12:19:13.353423 containerd[1680]: time="2025-12-16T12:19:13.353391868Z" level=info msg="received container exit event container_id:\"8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a\" id:\"8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a\" pid:2729 exit_status:1 exited_at:{seconds:1765887553 nanos:352087822}" Dec 16 12:19:13.350000 audit: BPF prog-id=88 op=UNLOAD Dec 16 12:19:13.354224 kernel: audit: type=1334 audit(1765887553.350:911): prog-id=88 op=UNLOAD Dec 16 12:19:13.356990 kubelet[2884]: E1216 12:19:13.356951 2884 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.180:58986->10.0.21.190:2379: read: connection timed out" Dec 16 12:19:13.358000 audit: BPF prog-id=108 op=UNLOAD Dec 16 12:19:13.360680 systemd[1]: cri-containerd-38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d.scope: Deactivated successfully. Dec 16 12:19:13.358000 audit: BPF prog-id=112 op=UNLOAD Dec 16 12:19:13.361027 systemd[1]: cri-containerd-38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d.scope: Consumed 4.255s CPU time, 23.2M memory peak. 
Dec 16 12:19:13.361623 kernel: audit: type=1334 audit(1765887553.358:912): prog-id=108 op=UNLOAD Dec 16 12:19:13.361676 kernel: audit: type=1334 audit(1765887553.358:913): prog-id=112 op=UNLOAD Dec 16 12:19:13.360000 audit: BPF prog-id=262 op=LOAD Dec 16 12:19:13.361000 audit: BPF prog-id=93 op=UNLOAD Dec 16 12:19:13.363769 kernel: audit: type=1334 audit(1765887553.360:914): prog-id=262 op=LOAD Dec 16 12:19:13.363836 kernel: audit: type=1334 audit(1765887553.361:915): prog-id=93 op=UNLOAD Dec 16 12:19:13.364331 containerd[1680]: time="2025-12-16T12:19:13.364280564Z" level=info msg="received container exit event container_id:\"38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d\" id:\"38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d\" pid:2722 exit_status:1 exited_at:{seconds:1765887553 nanos:362411394}" Dec 16 12:19:13.365000 audit: BPF prog-id=103 op=UNLOAD Dec 16 12:19:13.365000 audit: BPF prog-id=107 op=UNLOAD Dec 16 12:19:13.367681 kernel: audit: type=1334 audit(1765887553.365:916): prog-id=103 op=UNLOAD Dec 16 12:19:13.367748 kernel: audit: type=1334 audit(1765887553.365:917): prog-id=107 op=UNLOAD Dec 16 12:19:13.385102 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a-rootfs.mount: Deactivated successfully. Dec 16 12:19:13.388387 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d-rootfs.mount: Deactivated successfully. Dec 16 12:19:13.716323 systemd[1]: cri-containerd-7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5.scope: Deactivated successfully. Dec 16 12:19:13.716644 systemd[1]: cri-containerd-7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5.scope: Consumed 35.652s CPU time, 98.6M memory peak. 
Dec 16 12:19:13.717576 containerd[1680]: time="2025-12-16T12:19:13.717543325Z" level=info msg="received container exit event container_id:\"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\" id:\"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\" pid:3208 exit_status:1 exited_at:{seconds:1765887553 nanos:717301884}" Dec 16 12:19:13.724000 audit: BPF prog-id=146 op=UNLOAD Dec 16 12:19:13.724000 audit: BPF prog-id=150 op=UNLOAD Dec 16 12:19:13.727170 kernel: audit: type=1334 audit(1765887553.724:918): prog-id=146 op=UNLOAD Dec 16 12:19:13.727241 kernel: audit: type=1334 audit(1765887553.724:919): prog-id=150 op=UNLOAD Dec 16 12:19:13.737588 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5-rootfs.mount: Deactivated successfully. Dec 16 12:19:14.217527 kubelet[2884]: I1216 12:19:14.217493 2884 scope.go:117] "RemoveContainer" containerID="38b102d742cb82f5aa2203bb460f99b2fb2845a493ea541e7f3b69bd0463a81d" Dec 16 12:19:14.219258 containerd[1680]: time="2025-12-16T12:19:14.219225684Z" level=info msg="CreateContainer within sandbox \"2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 16 12:19:14.219541 kubelet[2884]: I1216 12:19:14.219470 2884 scope.go:117] "RemoveContainer" containerID="8c9b7bee1f4cd81264f37cbbccb08bb0f37766a189d348586ee855de89576a4a" Dec 16 12:19:14.222626 containerd[1680]: time="2025-12-16T12:19:14.222588141Z" level=info msg="CreateContainer within sandbox \"b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 12:19:14.226620 kubelet[2884]: I1216 12:19:14.226597 2884 scope.go:117] "RemoveContainer" containerID="7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5" Dec 16 12:19:14.228749 containerd[1680]: time="2025-12-16T12:19:14.228713332Z" level=info 
msg="CreateContainer within sandbox \"9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 12:19:14.240241 containerd[1680]: time="2025-12-16T12:19:14.240198471Z" level=info msg="Container 2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:14.245027 containerd[1680]: time="2025-12-16T12:19:14.244738494Z" level=info msg="Container c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:14.249995 containerd[1680]: time="2025-12-16T12:19:14.249842160Z" level=info msg="Container 4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:19:14.255732 containerd[1680]: time="2025-12-16T12:19:14.255676190Z" level=info msg="CreateContainer within sandbox \"2beb1286b819acf7a828d239636218ffcac03b1501450b56c2487ee12c124450\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e\"" Dec 16 12:19:14.256262 containerd[1680]: time="2025-12-16T12:19:14.256236593Z" level=info msg="StartContainer for \"2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e\"" Dec 16 12:19:14.257351 containerd[1680]: time="2025-12-16T12:19:14.257320918Z" level=info msg="connecting to shim 2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e" address="unix:///run/containerd/s/ef72796d5aeaa955ed3915aefdb00d0b2ae28953dbc3e58e0dae29faf0751789" protocol=ttrpc version=3 Dec 16 12:19:14.258934 containerd[1680]: time="2025-12-16T12:19:14.258878566Z" level=info msg="CreateContainer within sandbox \"9b35ee0ade21bcf96619a0b05464c3e178eef1b8c3e1f9cefebcf6f4c997e6eb\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c\"" Dec 16 
12:19:14.259781 containerd[1680]: time="2025-12-16T12:19:14.259418409Z" level=info msg="StartContainer for \"4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c\"" Dec 16 12:19:14.259955 containerd[1680]: time="2025-12-16T12:19:14.259908731Z" level=info msg="CreateContainer within sandbox \"b75041d8b8aeb6494062d9b601023d0c6be9abefbd30f006b4bd2096ce9786dd\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263\"" Dec 16 12:19:14.260370 containerd[1680]: time="2025-12-16T12:19:14.260203453Z" level=info msg="connecting to shim 4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c" address="unix:///run/containerd/s/507b8e916ef734dd03db135afa76061a073fa61b5ca391837baf8835da4b438b" protocol=ttrpc version=3 Dec 16 12:19:14.260522 containerd[1680]: time="2025-12-16T12:19:14.260273293Z" level=info msg="StartContainer for \"c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263\"" Dec 16 12:19:14.261780 containerd[1680]: time="2025-12-16T12:19:14.261748941Z" level=info msg="connecting to shim c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263" address="unix:///run/containerd/s/6709cd05166b606da57e33e60d9bee3af2a77f614e3426936b371827527b26fa" protocol=ttrpc version=3 Dec 16 12:19:14.289522 systemd[1]: Started cri-containerd-4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c.scope - libcontainer container 4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c. Dec 16 12:19:14.293669 systemd[1]: Started cri-containerd-2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e.scope - libcontainer container 2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e. Dec 16 12:19:14.294758 systemd[1]: Started cri-containerd-c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263.scope - libcontainer container c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263. 
Dec 16 12:19:14.302000 audit: BPF prog-id=263 op=LOAD Dec 16 12:19:14.303000 audit: BPF prog-id=264 op=LOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.303000 audit: BPF prog-id=264 op=UNLOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.303000 audit: BPF prog-id=265 op=LOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.303000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.303000 audit: BPF prog-id=266 op=LOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.303000 audit: BPF prog-id=266 op=UNLOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.303000 audit: BPF prog-id=265 op=UNLOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:19:14.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.303000 audit: BPF prog-id=267 op=LOAD Dec 16 12:19:14.303000 audit[5746]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3000 pid=5746 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3462326536363631396166376461616265343337386235343762633963 Dec 16 12:19:14.305000 audit: BPF prog-id=268 op=LOAD Dec 16 12:19:14.306000 audit: BPF prog-id=269 op=LOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.306000 audit: BPF prog-id=269 op=UNLOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.306000 audit: BPF prog-id=270 op=LOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.306000 audit: BPF prog-id=271 op=LOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.306000 audit: BPF prog-id=271 op=UNLOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.306000 audit: BPF prog-id=270 op=UNLOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.306000 audit: BPF prog-id=272 op=LOAD Dec 16 12:19:14.306000 audit[5745]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2596 pid=5745 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.306000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265626136333836393838653065646539353037653031663162623730 Dec 16 12:19:14.307000 audit: BPF prog-id=273 op=LOAD Dec 16 12:19:14.307000 audit: BPF prog-id=274 op=LOAD Dec 16 12:19:14.307000 audit[5748]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.307000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.307000 audit: BPF prog-id=274 op=UNLOAD Dec 16 12:19:14.307000 audit[5748]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.307000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.308000 audit: BPF prog-id=275 op=LOAD Dec 16 12:19:14.308000 audit[5748]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.308000 audit: BPF prog-id=276 op=LOAD Dec 16 12:19:14.308000 audit[5748]: 
SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.308000 audit: BPF prog-id=276 op=UNLOAD Dec 16 12:19:14.308000 audit[5748]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.308000 audit: BPF prog-id=275 op=UNLOAD Dec 16 12:19:14.308000 audit[5748]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.308000 audit: BPF prog-id=277 op=LOAD 
Dec 16 12:19:14.308000 audit[5748]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2577 pid=5748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:19:14.308000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339363061653366333536326531636133383262646162343032353030 Dec 16 12:19:14.344001 containerd[1680]: time="2025-12-16T12:19:14.343956400Z" level=info msg="StartContainer for \"c960ae3f3562e1ca382bdab402500af7831db294b23bfe5da46bec5123e68263\" returns successfully" Dec 16 12:19:14.344682 containerd[1680]: time="2025-12-16T12:19:14.344649444Z" level=info msg="StartContainer for \"4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c\" returns successfully" Dec 16 12:19:14.344875 containerd[1680]: time="2025-12-16T12:19:14.344836885Z" level=info msg="StartContainer for \"2eba6386988e0ede9507e01f1bb70cd77ac0063f33410c55e2c88de31677c54e\" returns successfully" Dec 16 12:19:15.086383 kubelet[2884]: I1216 12:19:15.086332 2884 status_manager.go:890] "Failed to get status for pod" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" pod="calico-system/whisker-69fc96cf55-8lskp" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.180:58918->10.0.21.190:2379: read: connection timed out" Dec 16 12:19:15.650057 kubelet[2884]: E1216 12:19:15.650007 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:19:15.653463 containerd[1680]: time="2025-12-16T12:19:15.653419438Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:19:15.983102 containerd[1680]: time="2025-12-16T12:19:15.983036479Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:15.984483 containerd[1680]: time="2025-12-16T12:19:15.984423486Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:19:15.984557 containerd[1680]: time="2025-12-16T12:19:15.984477526Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:15.984714 kubelet[2884]: E1216 12:19:15.984641 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:15.984801 kubelet[2884]: E1216 12:19:15.984719 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:19:15.984874 kubelet[2884]: E1216 12:19:15.984834 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:19:15.987739 containerd[1680]: time="2025-12-16T12:19:15.987473342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:19:16.312346 containerd[1680]: time="2025-12-16T12:19:16.311966917Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:16.313261 containerd[1680]: time="2025-12-16T12:19:16.313218883Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:19:16.313386 containerd[1680]: time="2025-12-16T12:19:16.313255403Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:16.313594 kubelet[2884]: E1216 12:19:16.313533 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:16.314176 kubelet[2884]: E1216 12:19:16.313921 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:19:16.314286 kubelet[2884]: E1216 12:19:16.314139 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qbzzt,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-2hkqn_calico-system(504cf836-455c-42a5-8d68-245e5d4890cf): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:16.315440 kubelet[2884]: E1216 12:19:16.315405 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:19:16.684230 kubelet[2884]: E1216 12:19:16.684107 2884 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.180:58814->10.0.21.190:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4547-0-0-0-5b424f63c8.1881b15e03734dd4 kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4547-0-0-0-5b424f63c8,UID:f06bec92f4e666224e51bdee5ebb29cd,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-0-5b424f63c8,},FirstTimestamp:2025-12-16 12:19:06.248875476 +0000 UTC m=+225.867868936,LastTimestamp:2025-12-16 12:19:06.248875476 +0000 UTC m=+225.867868936,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-0-5b424f63c8,}" Dec 16 12:19:18.649505 containerd[1680]: 
time="2025-12-16T12:19:18.649463958Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:19:19.019659 containerd[1680]: time="2025-12-16T12:19:19.019547285Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:19:19.021131 containerd[1680]: time="2025-12-16T12:19:19.021061773Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:19:19.021212 containerd[1680]: time="2025-12-16T12:19:19.021138573Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:19:19.021349 kubelet[2884]: E1216 12:19:19.021294 2884 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:19.021349 kubelet[2884]: E1216 12:19:19.021343 2884 kuberuntime_image.go:55] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:19:19.021922 kubelet[2884]: E1216 12:19:19.021459 2884 kuberuntime_manager.go:1341] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ptdjf,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-6b9b8c464c-4jgjb_calico-system(384a06af-c494-40e9-b0a8-31b5c5a33ae4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:19:19.022670 kubelet[2884]: E1216 12:19:19.022637 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:19:19.650347 kubelet[2884]: E1216 12:19:19.650301 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc 
= failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:19:20.649872 kubelet[2884]: E1216 12:19:20.649823 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:19:21.649264 kubelet[2884]: E1216 12:19:21.649210 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e" Dec 16 12:19:21.649436 kubelet[2884]: E1216 12:19:21.649264 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:19:23.357722 kubelet[2884]: E1216 12:19:23.357650 2884 controller.go:195] "Failed to update lease" err="Put \"https://10.0.21.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-0-5b424f63c8?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 12:19:25.580404 systemd[1]: cri-containerd-4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c.scope: Deactivated successfully. Dec 16 12:19:25.581148 containerd[1680]: time="2025-12-16T12:19:25.580714466Z" level=info msg="received container exit event container_id:\"4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c\" id:\"4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c\" pid:5783 exit_status:1 exited_at:{seconds:1765887565 nanos:580527545}" Dec 16 12:19:25.584000 audit: BPF prog-id=263 op=UNLOAD Dec 16 12:19:25.586626 kernel: kauditd_printk_skb: 66 callbacks suppressed Dec 16 12:19:25.586695 kernel: audit: type=1334 audit(1765887565.584:944): prog-id=263 op=UNLOAD Dec 16 12:19:25.586716 kernel: audit: type=1334 audit(1765887565.584:945): prog-id=267 op=UNLOAD Dec 16 12:19:25.584000 audit: BPF prog-id=267 op=UNLOAD Dec 16 12:19:25.600184 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c-rootfs.mount: Deactivated successfully. 
Dec 16 12:19:26.263666 kubelet[2884]: I1216 12:19:26.263602 2884 scope.go:117] "RemoveContainer" containerID="7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5" Dec 16 12:19:26.264497 kubelet[2884]: I1216 12:19:26.263839 2884 scope.go:117] "RemoveContainer" containerID="4b2e66619af7daabe4378b547bc9c9db42f20fb67bdc8b3e4319f44f2fba284c" Dec 16 12:19:26.264497 kubelet[2884]: E1216 12:19:26.264002 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-8ccrj_tigera-operator(2adc4df3-4f5c-4709-800e-2d961848e0d0)\"" pod="tigera-operator/tigera-operator-7dcd859c48-8ccrj" podUID="2adc4df3-4f5c-4709-800e-2d961848e0d0" Dec 16 12:19:26.265525 containerd[1680]: time="2025-12-16T12:19:26.265493998Z" level=info msg="RemoveContainer for \"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\"" Dec 16 12:19:26.272895 containerd[1680]: time="2025-12-16T12:19:26.272858756Z" level=info msg="RemoveContainer for \"7082627ca7b9b57089caf7086f20d7bf4f72746902a62d7e893afeb78ceb58b5\" returns successfully" Dec 16 12:19:26.649890 kubelet[2884]: E1216 12:19:26.649778 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-sbbxz" podUID="13366e02-1117-46c3-a880-8d8cc6c423f8" Dec 16 12:19:29.649871 kubelet[2884]: E1216 12:19:29.649807 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-2hkqn" podUID="504cf836-455c-42a5-8d68-245e5d4890cf" Dec 16 12:19:31.649820 kubelet[2884]: E1216 12:19:31.649764 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-9bb48c66-mjzv6" podUID="576d8526-5af6-453c-afc4-7ebd613c4146" Dec 16 12:19:32.654676 kubelet[2884]: E1216 12:19:32.654623 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b9b8c464c-4jgjb" podUID="384a06af-c494-40e9-b0a8-31b5c5a33ae4" Dec 16 12:19:33.358459 kubelet[2884]: E1216 12:19:33.358352 
2884 controller.go:195] "Failed to update lease" err="Put \"https://10.0.21.180:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-0-5b424f63c8?timeout=10s\": net/http: request canceled (Client.Timeout exceeded while awaiting headers)" Dec 16 12:19:33.649604 kubelet[2884]: E1216 12:19:33.649488 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-6784c79f67-5nkhx" podUID="3cfa8afe-d370-4d42-b9ea-f53cfd764b71" Dec 16 12:19:33.650000 kubelet[2884]: E1216 12:19:33.649914 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-69fc96cf55-8lskp" podUID="9883af1d-f9ff-4212-ac38-34ecc575631c" Dec 16 12:19:35.649381 kubelet[2884]: E1216 12:19:35.649300 2884 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-v5bnx" podUID="896c3574-3482-4970-a592-5c7752aa620e"