Dec 16 12:11:10.392596 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1]
Dec 16 12:11:10.392621 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Tue Dec 16 00:05:24 -00 2025
Dec 16 12:11:10.392648 kernel: KASLR enabled
Dec 16 12:11:10.392656 kernel: efi: EFI v2.7 by EDK II
Dec 16 12:11:10.392664 kernel: efi: SMBIOS 3.0=0x43bed0000 MEMATTR=0x43a714018 ACPI 2.0=0x438430018 RNG=0x43843e818 MEMRESERVE=0x438351218
Dec 16 12:11:10.392670 kernel: random: crng init done
Dec 16 12:11:10.392677 kernel: secureboot: Secure boot disabled
Dec 16 12:11:10.392684 kernel: ACPI: Early table checksum verification disabled
Dec 16 12:11:10.392690 kernel: ACPI: RSDP 0x0000000438430018 000024 (v02 BOCHS )
Dec 16 12:11:10.392699 kernel: ACPI: XSDT 0x000000043843FE98 000074 (v01 BOCHS BXPC 00000001 01000013)
Dec 16 12:11:10.392706 kernel: ACPI: FACP 0x000000043843FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392712 kernel: ACPI: DSDT 0x0000000438437518 0014A2 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392718 kernel: ACPI: APIC 0x000000043843FC18 0001A8 (v04 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392725 kernel: ACPI: PPTT 0x000000043843D898 000114 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392734 kernel: ACPI: GTDT 0x000000043843E898 000068 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392741 kernel: ACPI: MCFG 0x000000043843FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392748 kernel: ACPI: SPCR 0x000000043843E498 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392755 kernel: ACPI: DBG2 0x000000043843E798 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392762 kernel: ACPI: SRAT 0x000000043843E518 0000A0 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392769 kernel: ACPI: IORT 0x000000043843E618 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001)
Dec 16 12:11:10.392776 kernel: ACPI: BGRT 0x000000043843E718 000038 (v01 INTEL EDK2 00000002 01000013)
Dec 16 12:11:10.392783 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600
Dec 16 12:11:10.392789 kernel: ACPI: Use ACPI SPCR as default console: Yes
Dec 16 12:11:10.392798 kernel: ACPI: SRAT: Node 0 PXM 0 [mem 0x40000000-0x43fffffff]
Dec 16 12:11:10.392805 kernel: NODE_DATA(0) allocated [mem 0x43dff1a00-0x43dff8fff]
Dec 16 12:11:10.392811 kernel: Zone ranges:
Dec 16 12:11:10.392818 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff]
Dec 16 12:11:10.392825 kernel: DMA32 empty
Dec 16 12:11:10.392832 kernel: Normal [mem 0x0000000100000000-0x000000043fffffff]
Dec 16 12:11:10.392839 kernel: Device empty
Dec 16 12:11:10.392845 kernel: Movable zone start for each node
Dec 16 12:11:10.392852 kernel: Early memory node ranges
Dec 16 12:11:10.392859 kernel: node 0: [mem 0x0000000040000000-0x000000043843ffff]
Dec 16 12:11:10.392866 kernel: node 0: [mem 0x0000000438440000-0x000000043872ffff]
Dec 16 12:11:10.392873 kernel: node 0: [mem 0x0000000438730000-0x000000043bbfffff]
Dec 16 12:11:10.392881 kernel: node 0: [mem 0x000000043bc00000-0x000000043bfdffff]
Dec 16 12:11:10.392888 kernel: node 0: [mem 0x000000043bfe0000-0x000000043fffffff]
Dec 16 12:11:10.392895 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x000000043fffffff]
Dec 16 12:11:10.392902 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1
Dec 16 12:11:10.392909 kernel: psci: probing for conduit method from ACPI.
Dec 16 12:11:10.392918 kernel: psci: PSCIv1.3 detected in firmware.
Dec 16 12:11:10.392927 kernel: psci: Using standard PSCI v0.2 function IDs
Dec 16 12:11:10.392934 kernel: psci: Trusted OS migration not required
Dec 16 12:11:10.392941 kernel: psci: SMC Calling Convention v1.1
Dec 16 12:11:10.392949 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003)
Dec 16 12:11:10.392956 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x0 -> Node 0
Dec 16 12:11:10.392963 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x1 -> Node 0
Dec 16 12:11:10.392970 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x2 -> Node 0
Dec 16 12:11:10.392978 kernel: ACPI: NUMA: SRAT: PXM 0 -> MPIDR 0x3 -> Node 0
Dec 16 12:11:10.392986 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168
Dec 16 12:11:10.392993 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096
Dec 16 12:11:10.393001 kernel: pcpu-alloc: [0] 0 [0] 1 [0] 2 [0] 3
Dec 16 12:11:10.393008 kernel: Detected PIPT I-cache on CPU0
Dec 16 12:11:10.393016 kernel: CPU features: detected: GIC system register CPU interface
Dec 16 12:11:10.393023 kernel: CPU features: detected: Spectre-v4
Dec 16 12:11:10.393030 kernel: CPU features: detected: Spectre-BHB
Dec 16 12:11:10.393038 kernel: CPU features: kernel page table isolation forced ON by KASLR
Dec 16 12:11:10.393045 kernel: CPU features: detected: Kernel page table isolation (KPTI)
Dec 16 12:11:10.393052 kernel: CPU features: detected: ARM erratum 1418040
Dec 16 12:11:10.393059 kernel: CPU features: detected: SSBS not fully self-synchronizing
Dec 16 12:11:10.393068 kernel: alternatives: applying boot alternatives
Dec 16 12:11:10.393077 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749
Dec 16 12:11:10.393084 kernel: Dentry cache hash table entries: 2097152 (order: 12, 16777216 bytes, linear)
Dec 16 12:11:10.393092 kernel: Inode-cache hash table entries: 1048576 (order: 11, 8388608 bytes, linear)
Dec 16 12:11:10.393099 kernel: Fallback order for Node 0: 0
Dec 16 12:11:10.393106 kernel: Built 1 zonelists, mobility grouping on. Total pages: 4194304
Dec 16 12:11:10.393114 kernel: Policy zone: Normal
Dec 16 12:11:10.393121 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off
Dec 16 12:11:10.393128 kernel: software IO TLB: area num 4.
Dec 16 12:11:10.393135 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB)
Dec 16 12:11:10.393144 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=4, Nodes=1
Dec 16 12:11:10.393151 kernel: rcu: Preemptible hierarchical RCU implementation.
Dec 16 12:11:10.393159 kernel: rcu: RCU event tracing is enabled.
Dec 16 12:11:10.393167 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=4.
Dec 16 12:11:10.393174 kernel: Trampoline variant of Tasks RCU enabled.
Dec 16 12:11:10.393182 kernel: Tracing variant of Tasks RCU enabled.
Dec 16 12:11:10.393189 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies.
Dec 16 12:11:10.393196 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=4
Dec 16 12:11:10.393204 kernel: RCU Tasks: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:11:10.393211 kernel: RCU Tasks Trace: Setting shift to 2 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=4.
Dec 16 12:11:10.393218 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0
Dec 16 12:11:10.393227 kernel: GICv3: 256 SPIs implemented
Dec 16 12:11:10.393234 kernel: GICv3: 0 Extended SPIs implemented
Dec 16 12:11:10.393242 kernel: Root IRQ handler: gic_handle_irq
Dec 16 12:11:10.393249 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI
Dec 16 12:11:10.393256 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0
Dec 16 12:11:10.393263 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000
Dec 16 12:11:10.393270 kernel: ITS [mem 0x08080000-0x0809ffff]
Dec 16 12:11:10.393278 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100110000 (indirect, esz 8, psz 64K, shr 1)
Dec 16 12:11:10.393285 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100120000 (flat, esz 8, psz 64K, shr 1)
Dec 16 12:11:10.393293 kernel: GICv3: using LPI property table @0x0000000100130000
Dec 16 12:11:10.393300 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100140000
Dec 16 12:11:10.393307 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention.
Dec 16 12:11:10.393316 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:11:10.393323 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt).
Dec 16 12:11:10.393331 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns
Dec 16 12:11:10.393338 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns
Dec 16 12:11:10.393346 kernel: arm-pv: using stolen time PV
Dec 16 12:11:10.393354 kernel: Console: colour dummy device 80x25
Dec 16 12:11:10.393361 kernel: ACPI: Core revision 20240827
Dec 16 12:11:10.393369 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000)
Dec 16 12:11:10.393379 kernel: pid_max: default: 32768 minimum: 301
Dec 16 12:11:10.393386 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima
Dec 16 12:11:10.393394 kernel: landlock: Up and running.
Dec 16 12:11:10.393401 kernel: SELinux: Initializing.
Dec 16 12:11:10.393409 kernel: Mount-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:11:10.393417 kernel: Mountpoint-cache hash table entries: 32768 (order: 6, 262144 bytes, linear)
Dec 16 12:11:10.393425 kernel: rcu: Hierarchical SRCU implementation.
Dec 16 12:11:10.393432 kernel: rcu: Max phase no-delay instances is 400.
Dec 16 12:11:10.393442 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level
Dec 16 12:11:10.393449 kernel: Remapping and enabling EFI services.
Dec 16 12:11:10.393462 kernel: smp: Bringing up secondary CPUs ...
Dec 16 12:11:10.393470 kernel: Detected PIPT I-cache on CPU1
Dec 16 12:11:10.393478 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000
Dec 16 12:11:10.393486 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100150000
Dec 16 12:11:10.393494 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:11:10.393503 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1]
Dec 16 12:11:10.393511 kernel: Detected PIPT I-cache on CPU2
Dec 16 12:11:10.393524 kernel: GICv3: CPU2: found redistributor 2 region 0:0x00000000080e0000
Dec 16 12:11:10.393533 kernel: GICv3: CPU2: using allocated LPI pending table @0x0000000100160000
Dec 16 12:11:10.393542 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:11:10.393550 kernel: CPU2: Booted secondary processor 0x0000000002 [0x413fd0c1]
Dec 16 12:11:10.393558 kernel: Detected PIPT I-cache on CPU3
Dec 16 12:11:10.393566 kernel: GICv3: CPU3: found redistributor 3 region 0:0x0000000008100000
Dec 16 12:11:10.393576 kernel: GICv3: CPU3: using allocated LPI pending table @0x0000000100170000
Dec 16 12:11:10.393584 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040
Dec 16 12:11:10.393592 kernel: CPU3: Booted secondary processor 0x0000000003 [0x413fd0c1]
Dec 16 12:11:10.393600 kernel: smp: Brought up 1 node, 4 CPUs
Dec 16 12:11:10.393608 kernel: SMP: Total of 4 processors activated.
Dec 16 12:11:10.393617 kernel: CPU: All CPU(s) started at EL1
Dec 16 12:11:10.393651 kernel: CPU features: detected: 32-bit EL0 Support
Dec 16 12:11:10.393661 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence
Dec 16 12:11:10.393670 kernel: CPU features: detected: Common not Private translations
Dec 16 12:11:10.393678 kernel: CPU features: detected: CRC32 instructions
Dec 16 12:11:10.393686 kernel: CPU features: detected: Enhanced Virtualization Traps
Dec 16 12:11:10.393694 kernel: CPU features: detected: RCpc load-acquire (LDAPR)
Dec 16 12:11:10.393703 kernel: CPU features: detected: LSE atomic instructions
Dec 16 12:11:10.393713 kernel: CPU features: detected: Privileged Access Never
Dec 16 12:11:10.393721 kernel: CPU features: detected: RAS Extension Support
Dec 16 12:11:10.393729 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS)
Dec 16 12:11:10.393737 kernel: alternatives: applying system-wide alternatives
Dec 16 12:11:10.393745 kernel: CPU features: detected: Hardware dirty bit management on CPU0-3
Dec 16 12:11:10.393754 kernel: Memory: 16324432K/16777216K available (11200K kernel code, 2456K rwdata, 9084K rodata, 12480K init, 1038K bss, 430000K reserved, 16384K cma-reserved)
Dec 16 12:11:10.393762 kernel: devtmpfs: initialized
Dec 16 12:11:10.393772 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns
Dec 16 12:11:10.393780 kernel: futex hash table entries: 1024 (order: 4, 65536 bytes, linear)
Dec 16 12:11:10.393788 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL
Dec 16 12:11:10.393796 kernel: 0 pages in range for non-PLT usage
Dec 16 12:11:10.393804 kernel: 515168 pages in range for PLT usage
Dec 16 12:11:10.393812 kernel: pinctrl core: initialized pinctrl subsystem
Dec 16 12:11:10.393820 kernel: SMBIOS 3.0.0 present.
Dec 16 12:11:10.393828 kernel: DMI: QEMU KVM Virtual Machine, BIOS 0.0.0 02/06/2015
Dec 16 12:11:10.393837 kernel: DMI: Memory slots populated: 1/1
Dec 16 12:11:10.393845 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family
Dec 16 12:11:10.393853 kernel: DMA: preallocated 2048 KiB GFP_KERNEL pool for atomic allocations
Dec 16 12:11:10.393861 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations
Dec 16 12:11:10.393870 kernel: DMA: preallocated 2048 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations
Dec 16 12:11:10.393878 kernel: audit: initializing netlink subsys (disabled)
Dec 16 12:11:10.393886 kernel: audit: type=2000 audit(0.036:1): state=initialized audit_enabled=0 res=1
Dec 16 12:11:10.393895 kernel: thermal_sys: Registered thermal governor 'step_wise'
Dec 16 12:11:10.393904 kernel: cpuidle: using governor menu
Dec 16 12:11:10.393912 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers.
Dec 16 12:11:10.393920 kernel: ASID allocator initialised with 32768 entries
Dec 16 12:11:10.393928 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5
Dec 16 12:11:10.393936 kernel: Serial: AMBA PL011 UART driver
Dec 16 12:11:10.393944 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages
Dec 16 12:11:10.393954 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page
Dec 16 12:11:10.393962 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages
Dec 16 12:11:10.393970 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page
Dec 16 12:11:10.393978 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages
Dec 16 12:11:10.393986 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page
Dec 16 12:11:10.393994 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages
Dec 16 12:11:10.394003 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page
Dec 16 12:11:10.394010 kernel: ACPI: Added _OSI(Module Device)
Dec 16 12:11:10.394020 kernel: ACPI: Added _OSI(Processor Device)
Dec 16 12:11:10.394028 kernel: ACPI: Added _OSI(Processor Aggregator Device)
Dec 16 12:11:10.394037 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded
Dec 16 12:11:10.394045 kernel: ACPI: Interpreter enabled
Dec 16 12:11:10.394053 kernel: ACPI: Using GIC for interrupt routing
Dec 16 12:11:10.394061 kernel: ACPI: MCFG table detected, 1 entries
Dec 16 12:11:10.394069 kernel: ACPI: CPU0 has been hot-added
Dec 16 12:11:10.394078 kernel: ACPI: CPU1 has been hot-added
Dec 16 12:11:10.394086 kernel: ACPI: CPU2 has been hot-added
Dec 16 12:11:10.394094 kernel: ACPI: CPU3 has been hot-added
Dec 16 12:11:10.394102 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA
Dec 16 12:11:10.394110 kernel: printk: legacy console [ttyAMA0] enabled
Dec 16 12:11:10.394118 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff])
Dec 16 12:11:10.394335 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3]
Dec 16 12:11:10.394431 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR]
Dec 16 12:11:10.394515 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability]
Dec 16 12:11:10.394600 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00
Dec 16 12:11:10.394740 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff]
Dec 16 12:11:10.394752 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window]
Dec 16 12:11:10.394761 kernel: PCI host bridge to bus 0000:00
Dec 16 12:11:10.394853 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window]
Dec 16 12:11:10.394933 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window]
Dec 16 12:11:10.395011 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window]
Dec 16 12:11:10.395097 kernel: pci_bus 0000:00: root bus resource [bus 00-ff]
Dec 16 12:11:10.395213 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint
Dec 16 12:11:10.395318 kernel: pci 0000:00:01.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.395424 kernel: pci 0000:00:01.0: BAR 0 [mem 0x125a0000-0x125a0fff]
Dec 16 12:11:10.395530 kernel: pci 0000:00:01.0: PCI bridge to [bus 01]
Dec 16 12:11:10.395657 kernel: pci 0000:00:01.0: bridge window [mem 0x12400000-0x124fffff]
Dec 16 12:11:10.395748 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref]
Dec 16 12:11:10.395842 kernel: pci 0000:00:01.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.395934 kernel: pci 0000:00:01.1: BAR 0 [mem 0x1259f000-0x1259ffff]
Dec 16 12:11:10.396019 kernel: pci 0000:00:01.1: PCI bridge to [bus 02]
Dec 16 12:11:10.396102 kernel: pci 0000:00:01.1: bridge window [mem 0x12300000-0x123fffff]
Dec 16 12:11:10.396192 kernel: pci 0000:00:01.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.396275 kernel: pci 0000:00:01.2: BAR 0 [mem 0x1259e000-0x1259efff]
Dec 16 12:11:10.396360 kernel: pci 0000:00:01.2: PCI bridge to [bus 03]
Dec 16 12:11:10.396442 kernel: pci 0000:00:01.2: bridge window [mem 0x12200000-0x122fffff]
Dec 16 12:11:10.396524 kernel: pci 0000:00:01.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref]
Dec 16 12:11:10.396614 kernel: pci 0000:00:01.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.396711 kernel: pci 0000:00:01.3: BAR 0 [mem 0x1259d000-0x1259dfff]
Dec 16 12:11:10.396795 kernel: pci 0000:00:01.3: PCI bridge to [bus 04]
Dec 16 12:11:10.396881 kernel: pci 0000:00:01.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref]
Dec 16 12:11:10.396973 kernel: pci 0000:00:01.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.397057 kernel: pci 0000:00:01.4: BAR 0 [mem 0x1259c000-0x1259cfff]
Dec 16 12:11:10.397140 kernel: pci 0000:00:01.4: PCI bridge to [bus 05]
Dec 16 12:11:10.397222 kernel: pci 0000:00:01.4: bridge window [mem 0x12100000-0x121fffff]
Dec 16 12:11:10.397320 kernel: pci 0000:00:01.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref]
Dec 16 12:11:10.397418 kernel: pci 0000:00:01.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.397502 kernel: pci 0000:00:01.5: BAR 0 [mem 0x1259b000-0x1259bfff]
Dec 16 12:11:10.397585 kernel: pci 0000:00:01.5: PCI bridge to [bus 06]
Dec 16 12:11:10.397681 kernel: pci 0000:00:01.5: bridge window [mem 0x12000000-0x120fffff]
Dec 16 12:11:10.397766 kernel: pci 0000:00:01.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref]
Dec 16 12:11:10.397857 kernel: pci 0000:00:01.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.397944 kernel: pci 0000:00:01.6: BAR 0 [mem 0x1259a000-0x1259afff]
Dec 16 12:11:10.398027 kernel: pci 0000:00:01.6: PCI bridge to [bus 07]
Dec 16 12:11:10.398117 kernel: pci 0000:00:01.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.398199 kernel: pci 0000:00:01.7: BAR 0 [mem 0x12599000-0x12599fff]
Dec 16 12:11:10.398282 kernel: pci 0000:00:01.7: PCI bridge to [bus 08]
Dec 16 12:11:10.398372 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.398459 kernel: pci 0000:00:02.0: BAR 0 [mem 0x12598000-0x12598fff]
Dec 16 12:11:10.398543 kernel: pci 0000:00:02.0: PCI bridge to [bus 09]
Dec 16 12:11:10.398643 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.398732 kernel: pci 0000:00:02.1: BAR 0 [mem 0x12597000-0x12597fff]
Dec 16 12:11:10.398820 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a]
Dec 16 12:11:10.398909 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.398992 kernel: pci 0000:00:02.2: BAR 0 [mem 0x12596000-0x12596fff]
Dec 16 12:11:10.399076 kernel: pci 0000:00:02.2: PCI bridge to [bus 0b]
Dec 16 12:11:10.399165 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.399248 kernel: pci 0000:00:02.3: BAR 0 [mem 0x12595000-0x12595fff]
Dec 16 12:11:10.399358 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c]
Dec 16 12:11:10.399453 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.399537 kernel: pci 0000:00:02.4: BAR 0 [mem 0x12594000-0x12594fff]
Dec 16 12:11:10.399647 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d]
Dec 16 12:11:10.399747 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.399833 kernel: pci 0000:00:02.5: BAR 0 [mem 0x12593000-0x12593fff]
Dec 16 12:11:10.399922 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e]
Dec 16 12:11:10.400026 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.400114 kernel: pci 0000:00:02.6: BAR 0 [mem 0x12592000-0x12592fff]
Dec 16 12:11:10.400198 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f]
Dec 16 12:11:10.400288 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.400374 kernel: pci 0000:00:02.7: BAR 0 [mem 0x12591000-0x12591fff]
Dec 16 12:11:10.400456 kernel: pci 0000:00:02.7: PCI bridge to [bus 10]
Dec 16 12:11:10.400545 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.400641 kernel: pci 0000:00:03.0: BAR 0 [mem 0x12590000-0x12590fff]
Dec 16 12:11:10.400730 kernel: pci 0000:00:03.0: PCI bridge to [bus 11]
Dec 16 12:11:10.400819 kernel: pci 0000:00:03.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.400907 kernel: pci 0000:00:03.1: BAR 0 [mem 0x1258f000-0x1258ffff]
Dec 16 12:11:10.400995 kernel: pci 0000:00:03.1: PCI bridge to [bus 12]
Dec 16 12:11:10.401079 kernel: pci 0000:00:03.1: bridge window [io 0xf000-0xffff]
Dec 16 12:11:10.401164 kernel: pci 0000:00:03.1: bridge window [mem 0x11e00000-0x11ffffff]
Dec 16 12:11:10.401253 kernel: pci 0000:00:03.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.401336 kernel: pci 0000:00:03.2: BAR 0 [mem 0x1258e000-0x1258efff]
Dec 16 12:11:10.401421 kernel: pci 0000:00:03.2: PCI bridge to [bus 13]
Dec 16 12:11:10.401504 kernel: pci 0000:00:03.2: bridge window [io 0xe000-0xefff]
Dec 16 12:11:10.401587 kernel: pci 0000:00:03.2: bridge window [mem 0x11c00000-0x11dfffff]
Dec 16 12:11:10.401697 kernel: pci 0000:00:03.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.401785 kernel: pci 0000:00:03.3: BAR 0 [mem 0x1258d000-0x1258dfff]
Dec 16 12:11:10.401868 kernel: pci 0000:00:03.3: PCI bridge to [bus 14]
Dec 16 12:11:10.401955 kernel: pci 0000:00:03.3: bridge window [io 0xd000-0xdfff]
Dec 16 12:11:10.402040 kernel: pci 0000:00:03.3: bridge window [mem 0x11a00000-0x11bfffff]
Dec 16 12:11:10.402165 kernel: pci 0000:00:03.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.402275 kernel: pci 0000:00:03.4: BAR 0 [mem 0x1258c000-0x1258cfff]
Dec 16 12:11:10.402362 kernel: pci 0000:00:03.4: PCI bridge to [bus 15]
Dec 16 12:11:10.402445 kernel: pci 0000:00:03.4: bridge window [io 0xc000-0xcfff]
Dec 16 12:11:10.402532 kernel: pci 0000:00:03.4: bridge window [mem 0x11800000-0x119fffff]
Dec 16 12:11:10.402623 kernel: pci 0000:00:03.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.402721 kernel: pci 0000:00:03.5: BAR 0 [mem 0x1258b000-0x1258bfff]
Dec 16 12:11:10.402805 kernel: pci 0000:00:03.5: PCI bridge to [bus 16]
Dec 16 12:11:10.402888 kernel: pci 0000:00:03.5: bridge window [io 0xb000-0xbfff]
Dec 16 12:11:10.402971 kernel: pci 0000:00:03.5: bridge window [mem 0x11600000-0x117fffff]
Dec 16 12:11:10.403090 kernel: pci 0000:00:03.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.403177 kernel: pci 0000:00:03.6: BAR 0 [mem 0x1258a000-0x1258afff]
Dec 16 12:11:10.403262 kernel: pci 0000:00:03.6: PCI bridge to [bus 17]
Dec 16 12:11:10.404181 kernel: pci 0000:00:03.6: bridge window [io 0xa000-0xafff]
Dec 16 12:11:10.404432 kernel: pci 0000:00:03.6: bridge window [mem 0x11400000-0x115fffff]
Dec 16 12:11:10.404530 kernel: pci 0000:00:03.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.404618 kernel: pci 0000:00:03.7: BAR 0 [mem 0x12589000-0x12589fff]
Dec 16 12:11:10.404721 kernel: pci 0000:00:03.7: PCI bridge to [bus 18]
Dec 16 12:11:10.404806 kernel: pci 0000:00:03.7: bridge window [io 0x9000-0x9fff]
Dec 16 12:11:10.404893 kernel: pci 0000:00:03.7: bridge window [mem 0x11200000-0x113fffff]
Dec 16 12:11:10.404984 kernel: pci 0000:00:04.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.405068 kernel: pci 0000:00:04.0: BAR 0 [mem 0x12588000-0x12588fff]
Dec 16 12:11:10.405153 kernel: pci 0000:00:04.0: PCI bridge to [bus 19]
Dec 16 12:11:10.405236 kernel: pci 0000:00:04.0: bridge window [io 0x8000-0x8fff]
Dec 16 12:11:10.405318 kernel: pci 0000:00:04.0: bridge window [mem 0x11000000-0x111fffff]
Dec 16 12:11:10.405409 kernel: pci 0000:00:04.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.405493 kernel: pci 0000:00:04.1: BAR 0 [mem 0x12587000-0x12587fff]
Dec 16 12:11:10.405575 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a]
Dec 16 12:11:10.405673 kernel: pci 0000:00:04.1: bridge window [io 0x7000-0x7fff]
Dec 16 12:11:10.405758 kernel: pci 0000:00:04.1: bridge window [mem 0x10e00000-0x10ffffff]
Dec 16 12:11:10.405849 kernel: pci 0000:00:04.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.405933 kernel: pci 0000:00:04.2: BAR 0 [mem 0x12586000-0x12586fff]
Dec 16 12:11:10.406015 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b]
Dec 16 12:11:10.406097 kernel: pci 0000:00:04.2: bridge window [io 0x6000-0x6fff]
Dec 16 12:11:10.406183 kernel: pci 0000:00:04.2: bridge window [mem 0x10c00000-0x10dfffff]
Dec 16 12:11:10.406274 kernel: pci 0000:00:04.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.406361 kernel: pci 0000:00:04.3: BAR 0 [mem 0x12585000-0x12585fff]
Dec 16 12:11:10.406445 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c]
Dec 16 12:11:10.406529 kernel: pci 0000:00:04.3: bridge window [io 0x5000-0x5fff]
Dec 16 12:11:10.406611 kernel: pci 0000:00:04.3: bridge window [mem 0x10a00000-0x10bfffff]
Dec 16 12:11:10.406714 kernel: pci 0000:00:04.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.406800 kernel: pci 0000:00:04.4: BAR 0 [mem 0x12584000-0x12584fff]
Dec 16 12:11:10.406917 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d]
Dec 16 12:11:10.407006 kernel: pci 0000:00:04.4: bridge window [io 0x4000-0x4fff]
Dec 16 12:11:10.407089 kernel: pci 0000:00:04.4: bridge window [mem 0x10800000-0x109fffff]
Dec 16 12:11:10.407179 kernel: pci 0000:00:04.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.407262 kernel: pci 0000:00:04.5: BAR 0 [mem 0x12583000-0x12583fff]
Dec 16 12:11:10.407344 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e]
Dec 16 12:11:10.407429 kernel: pci 0000:00:04.5: bridge window [io 0x3000-0x3fff]
Dec 16 12:11:10.407516 kernel: pci 0000:00:04.5: bridge window [mem 0x10600000-0x107fffff]
Dec 16 12:11:10.407616 kernel: pci 0000:00:04.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.407729 kernel: pci 0000:00:04.6: BAR 0 [mem 0x12582000-0x12582fff]
Dec 16 12:11:10.407816 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f]
Dec 16 12:11:10.407899 kernel: pci 0000:00:04.6: bridge window [io 0x2000-0x2fff]
Dec 16 12:11:10.407982 kernel: pci 0000:00:04.6: bridge window [mem 0x10400000-0x105fffff]
Dec 16 12:11:10.408082 kernel: pci 0000:00:04.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.408166 kernel: pci 0000:00:04.7: BAR 0 [mem 0x12581000-0x12581fff]
Dec 16 12:11:10.408249 kernel: pci 0000:00:04.7: PCI bridge to [bus 20]
Dec 16 12:11:10.408332 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x1fff]
Dec 16 12:11:10.408415 kernel: pci 0000:00:04.7: bridge window [mem 0x10200000-0x103fffff]
Dec 16 12:11:10.408505 kernel: pci 0000:00:05.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port
Dec 16 12:11:10.408592 kernel: pci 0000:00:05.0: BAR 0 [mem 0x12580000-0x12580fff]
Dec 16 12:11:10.408693 kernel: pci 0000:00:05.0: PCI bridge to [bus 21]
Dec 16 12:11:10.408779 kernel: pci 0000:00:05.0: bridge window [io 0x0000-0x0fff]
Dec 16 12:11:10.408863 kernel: pci 0000:00:05.0: bridge window [mem 0x10000000-0x101fffff]
Dec 16 12:11:10.408959 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint
Dec 16 12:11:10.409046 kernel: pci 0000:01:00.0: BAR 1 [mem 0x12400000-0x12400fff]
Dec 16 12:11:10.409136 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]
Dec 16 12:11:10.409227 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref]
Dec 16 12:11:10.409323 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint
Dec 16 12:11:10.409415 kernel: pci 0000:02:00.0: BAR 0 [mem 0x12300000-0x12303fff 64bit]
Dec 16 12:11:10.409512 kernel: pci 0000:03:00.0: [1af4:1042] type 00 class 0x010000 PCIe Endpoint
Dec 16 12:11:10.409607 kernel: pci 0000:03:00.0: BAR 1 [mem 0x12200000-0x12200fff]
Dec 16 12:11:10.409735 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref]
Dec 16 12:11:10.409842 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:11:10.409931 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref]
Dec 16 12:11:10.410027 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint
Dec 16 12:11:10.410123 kernel: pci 0000:05:00.0: BAR 1 [mem 0x12100000-0x12100fff]
Dec 16 12:11:10.410213 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref]
Dec 16 12:11:10.410308 kernel: pci 0000:06:00.0: [1af4:1050] type 00 class 0x038000 PCIe Endpoint
Dec 16 12:11:10.410396 kernel: pci 0000:06:00.0: BAR 1 [mem 0x12000000-0x12000fff]
Dec 16 12:11:10.410482 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]
Dec 16 12:11:10.410569 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000
Dec 16 12:11:10.410671 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:11:10.410758 kernel: pci 0000:00:01.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000
Dec 16 12:11:10.410846 kernel: pci 0000:00:01.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000
Dec 16 12:11:10.410932 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000
Dec 16 12:11:10.411020 kernel: pci 0000:00:01.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000
Dec 16 12:11:10.411110 kernel: pci 0000:00:01.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000
Dec 16 12:11:10.411198 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:11:10.411303 kernel: pci 0000:00:01.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000
Dec 16 12:11:10.411393 kernel: pci 0000:00:01.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000
Dec 16 12:11:10.411481 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000
Dec 16 12:11:10.411565 kernel: pci 0000:00:01.3: bridge window [mem 0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000
Dec 16 12:11:10.411674 kernel: pci 0000:00:01.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000
Dec 16 12:11:10.411762 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:11:10.411845 kernel: pci 0000:00:01.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000
Dec 16 12:11:10.411931 kernel: pci 0000:00:01.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000
Dec 16 12:11:10.412040 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:11:10.412126 kernel: pci 0000:00:01.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000
Dec 16 12:11:10.412218 kernel: pci 0000:00:01.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000
Dec 16 12:11:10.412303 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:11:10.412392 kernel: pci 0000:00:01.6: bridge window [mem 0x00100000-0x000fffff] to [bus 07] add_size 200000 add_align 100000
Dec 16 12:11:10.412488 kernel: pci 0000:00:01.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000
Dec 16 12:11:10.412577 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:11:10.412673 kernel: pci 0000:00:01.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000
Dec 16 12:11:10.412762 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000
Dec 16 12:11:10.412853 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:11:10.412937 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000
Dec 16 12:11:10.413027 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 0a] add_size 1000
Dec 16 12:11:10.413145 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:11:10.413231 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff] to [bus 0a] add_size 200000 add_align 100000
Dec 16 12:11:10.413322 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 0b] add_size 1000
Dec 16 12:11:10.413407 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:11:10.413492 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x000fffff] to [bus 0b] add_size 200000 add_align 100000
Dec 16 12:11:10.413583 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 0c] add_size 1000
Dec 16 12:11:10.413683 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:11:10.413768 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x000fffff] to [bus 0c] add_size 200000 add_align 100000
Dec 16 12:11:10.413855 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 0d] add_size 1000
Dec 16 12:11:10.413938 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0d] add_size 200000 add_align 100000
Dec 16 12:11:10.414022 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x000fffff] to [bus 0d] add_size 200000 add_align 100000
Dec 16
12:11:10.414114 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 0e] add_size 1000 Dec 16 12:11:10.414201 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0e] add_size 200000 add_align 100000 Dec 16 12:11:10.414284 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x000fffff] to [bus 0e] add_size 200000 add_align 100000 Dec 16 12:11:10.414372 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 0f] add_size 1000 Dec 16 12:11:10.414460 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 0f] add_size 200000 add_align 100000 Dec 16 12:11:10.414549 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x000fffff] to [bus 0f] add_size 200000 add_align 100000 Dec 16 12:11:10.414649 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 10] add_size 1000 Dec 16 12:11:10.414736 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 10] add_size 200000 add_align 100000 Dec 16 12:11:10.414821 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 10] add_size 200000 add_align 100000 Dec 16 12:11:10.414909 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 11] add_size 1000 Dec 16 12:11:10.414995 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 11] add_size 200000 add_align 100000 Dec 16 12:11:10.415085 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 11] add_size 200000 add_align 100000 Dec 16 12:11:10.415176 kernel: pci 0000:00:03.1: bridge window [io 0x1000-0x0fff] to [bus 12] add_size 1000 Dec 16 12:11:10.415264 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 12] add_size 200000 add_align 100000 Dec 16 12:11:10.415349 kernel: pci 0000:00:03.1: bridge window [mem 0x00100000-0x000fffff] to [bus 12] add_size 200000 add_align 100000 Dec 16 12:11:10.415436 kernel: pci 0000:00:03.2: 
bridge window [io 0x1000-0x0fff] to [bus 13] add_size 1000 Dec 16 12:11:10.415524 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 13] add_size 200000 add_align 100000 Dec 16 12:11:10.415621 kernel: pci 0000:00:03.2: bridge window [mem 0x00100000-0x000fffff] to [bus 13] add_size 200000 add_align 100000 Dec 16 12:11:10.415741 kernel: pci 0000:00:03.3: bridge window [io 0x1000-0x0fff] to [bus 14] add_size 1000 Dec 16 12:11:10.415831 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 14] add_size 200000 add_align 100000 Dec 16 12:11:10.415935 kernel: pci 0000:00:03.3: bridge window [mem 0x00100000-0x000fffff] to [bus 14] add_size 200000 add_align 100000 Dec 16 12:11:10.416028 kernel: pci 0000:00:03.4: bridge window [io 0x1000-0x0fff] to [bus 15] add_size 1000 Dec 16 12:11:10.416118 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 15] add_size 200000 add_align 100000 Dec 16 12:11:10.416202 kernel: pci 0000:00:03.4: bridge window [mem 0x00100000-0x000fffff] to [bus 15] add_size 200000 add_align 100000 Dec 16 12:11:10.416288 kernel: pci 0000:00:03.5: bridge window [io 0x1000-0x0fff] to [bus 16] add_size 1000 Dec 16 12:11:10.416372 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 16] add_size 200000 add_align 100000 Dec 16 12:11:10.416455 kernel: pci 0000:00:03.5: bridge window [mem 0x00100000-0x000fffff] to [bus 16] add_size 200000 add_align 100000 Dec 16 12:11:10.416542 kernel: pci 0000:00:03.6: bridge window [io 0x1000-0x0fff] to [bus 17] add_size 1000 Dec 16 12:11:10.416642 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 17] add_size 200000 add_align 100000 Dec 16 12:11:10.416729 kernel: pci 0000:00:03.6: bridge window [mem 0x00100000-0x000fffff] to [bus 17] add_size 200000 add_align 100000 Dec 16 12:11:10.416818 kernel: pci 0000:00:03.7: bridge window [io 0x1000-0x0fff] to [bus 18] 
add_size 1000 Dec 16 12:11:10.416911 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 18] add_size 200000 add_align 100000 Dec 16 12:11:10.416998 kernel: pci 0000:00:03.7: bridge window [mem 0x00100000-0x000fffff] to [bus 18] add_size 200000 add_align 100000 Dec 16 12:11:10.417090 kernel: pci 0000:00:04.0: bridge window [io 0x1000-0x0fff] to [bus 19] add_size 1000 Dec 16 12:11:10.417175 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 19] add_size 200000 add_align 100000 Dec 16 12:11:10.417261 kernel: pci 0000:00:04.0: bridge window [mem 0x00100000-0x000fffff] to [bus 19] add_size 200000 add_align 100000 Dec 16 12:11:10.417348 kernel: pci 0000:00:04.1: bridge window [io 0x1000-0x0fff] to [bus 1a] add_size 1000 Dec 16 12:11:10.417450 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1a] add_size 200000 add_align 100000 Dec 16 12:11:10.417532 kernel: pci 0000:00:04.1: bridge window [mem 0x00100000-0x000fffff] to [bus 1a] add_size 200000 add_align 100000 Dec 16 12:11:10.417621 kernel: pci 0000:00:04.2: bridge window [io 0x1000-0x0fff] to [bus 1b] add_size 1000 Dec 16 12:11:10.417732 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1b] add_size 200000 add_align 100000 Dec 16 12:11:10.417817 kernel: pci 0000:00:04.2: bridge window [mem 0x00100000-0x000fffff] to [bus 1b] add_size 200000 add_align 100000 Dec 16 12:11:10.417907 kernel: pci 0000:00:04.3: bridge window [io 0x1000-0x0fff] to [bus 1c] add_size 1000 Dec 16 12:11:10.417992 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1c] add_size 200000 add_align 100000 Dec 16 12:11:10.418079 kernel: pci 0000:00:04.3: bridge window [mem 0x00100000-0x000fffff] to [bus 1c] add_size 200000 add_align 100000 Dec 16 12:11:10.418166 kernel: pci 0000:00:04.4: bridge window [io 0x1000-0x0fff] to [bus 1d] add_size 1000 Dec 16 12:11:10.418251 kernel: 
pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1d] add_size 200000 add_align 100000 Dec 16 12:11:10.418334 kernel: pci 0000:00:04.4: bridge window [mem 0x00100000-0x000fffff] to [bus 1d] add_size 200000 add_align 100000 Dec 16 12:11:10.418420 kernel: pci 0000:00:04.5: bridge window [io 0x1000-0x0fff] to [bus 1e] add_size 1000 Dec 16 12:11:10.418505 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1e] add_size 200000 add_align 100000 Dec 16 12:11:10.418590 kernel: pci 0000:00:04.5: bridge window [mem 0x00100000-0x000fffff] to [bus 1e] add_size 200000 add_align 100000 Dec 16 12:11:10.418694 kernel: pci 0000:00:04.6: bridge window [io 0x1000-0x0fff] to [bus 1f] add_size 1000 Dec 16 12:11:10.418816 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 1f] add_size 200000 add_align 100000 Dec 16 12:11:10.418905 kernel: pci 0000:00:04.6: bridge window [mem 0x00100000-0x000fffff] to [bus 1f] add_size 200000 add_align 100000 Dec 16 12:11:10.418994 kernel: pci 0000:00:04.7: bridge window [io 0x1000-0x0fff] to [bus 20] add_size 1000 Dec 16 12:11:10.419097 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 20] add_size 200000 add_align 100000 Dec 16 12:11:10.419189 kernel: pci 0000:00:04.7: bridge window [mem 0x00100000-0x000fffff] to [bus 20] add_size 200000 add_align 100000 Dec 16 12:11:10.420246 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x0fff] to [bus 21] add_size 1000 Dec 16 12:11:10.420338 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 21] add_size 200000 add_align 100000 Dec 16 12:11:10.420423 kernel: pci 0000:00:05.0: bridge window [mem 0x00100000-0x000fffff] to [bus 21] add_size 200000 add_align 100000 Dec 16 12:11:10.420509 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 16 12:11:10.420598 kernel: pci 0000:00:01.0: bridge window [mem 
0x8000000000-0x80001fffff 64bit pref]: assigned Dec 16 12:11:10.420703 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 16 12:11:10.420790 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 16 12:11:10.420878 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 16 12:11:10.420964 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 16 12:11:10.421051 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 16 12:11:10.421136 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 16 12:11:10.421227 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 16 12:11:10.421311 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 16 12:11:10.421396 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 16 12:11:10.421480 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 16 12:11:10.421570 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 16 12:11:10.421668 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 16 12:11:10.421760 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 16 12:11:10.421843 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 16 12:11:10.421931 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 16 12:11:10.422023 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 16 12:11:10.422111 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff]: assigned Dec 16 12:11:10.422198 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref]: 
assigned Dec 16 12:11:10.422286 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff]: assigned Dec 16 12:11:10.422371 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref]: assigned Dec 16 12:11:10.422457 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff]: assigned Dec 16 12:11:10.422542 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref]: assigned Dec 16 12:11:10.422655 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff]: assigned Dec 16 12:11:10.422754 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref]: assigned Dec 16 12:11:10.422844 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff]: assigned Dec 16 12:11:10.422934 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref]: assigned Dec 16 12:11:10.424736 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff]: assigned Dec 16 12:11:10.424826 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref]: assigned Dec 16 12:11:10.424913 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff]: assigned Dec 16 12:11:10.424997 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref]: assigned Dec 16 12:11:10.425082 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff]: assigned Dec 16 12:11:10.425171 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref]: assigned Dec 16 12:11:10.425257 kernel: pci 0000:00:03.1: bridge window [mem 0x12200000-0x123fffff]: assigned Dec 16 12:11:10.425341 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref]: assigned Dec 16 12:11:10.425428 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff]: assigned Dec 16 12:11:10.425519 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref]: assigned Dec 16 12:11:10.425608 kernel: pci 
0000:00:03.3: bridge window [mem 0x12600000-0x127fffff]: assigned Dec 16 12:11:10.425712 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref]: assigned Dec 16 12:11:10.425801 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff]: assigned Dec 16 12:11:10.425887 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref]: assigned Dec 16 12:11:10.425976 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff]: assigned Dec 16 12:11:10.426060 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref]: assigned Dec 16 12:11:10.426146 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff]: assigned Dec 16 12:11:10.426235 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref]: assigned Dec 16 12:11:10.426326 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff]: assigned Dec 16 12:11:10.426410 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref]: assigned Dec 16 12:11:10.427182 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff]: assigned Dec 16 12:11:10.427289 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref]: assigned Dec 16 12:11:10.427378 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff]: assigned Dec 16 12:11:10.427464 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref]: assigned Dec 16 12:11:10.427556 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff]: assigned Dec 16 12:11:10.427686 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref]: assigned Dec 16 12:11:10.427781 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff]: assigned Dec 16 12:11:10.427868 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref]: assigned Dec 16 12:11:10.427978 kernel: pci 0000:00:04.4: bridge window [mem 
0x13800000-0x139fffff]: assigned Dec 16 12:11:10.428069 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref]: assigned Dec 16 12:11:10.428162 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff]: assigned Dec 16 12:11:10.428255 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref]: assigned Dec 16 12:11:10.428344 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff]: assigned Dec 16 12:11:10.428432 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref]: assigned Dec 16 12:11:10.428522 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff]: assigned Dec 16 12:11:10.428616 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref]: assigned Dec 16 12:11:10.428731 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff]: assigned Dec 16 12:11:10.428827 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref]: assigned Dec 16 12:11:10.428918 kernel: pci 0000:00:01.0: BAR 0 [mem 0x14200000-0x14200fff]: assigned Dec 16 12:11:10.429024 kernel: pci 0000:00:01.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:11:10.429113 kernel: pci 0000:00:01.1: BAR 0 [mem 0x14201000-0x14201fff]: assigned Dec 16 12:11:10.429201 kernel: pci 0000:00:01.1: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:11:10.429288 kernel: pci 0000:00:01.2: BAR 0 [mem 0x14202000-0x14202fff]: assigned Dec 16 12:11:10.429376 kernel: pci 0000:00:01.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:11:10.429464 kernel: pci 0000:00:01.3: BAR 0 [mem 0x14203000-0x14203fff]: assigned Dec 16 12:11:10.429549 kernel: pci 0000:00:01.3: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:11:10.429649 kernel: pci 0000:00:01.4: BAR 0 [mem 0x14204000-0x14204fff]: assigned Dec 16 12:11:10.429738 kernel: pci 0000:00:01.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:11:10.429825 kernel: pci 0000:00:01.5: BAR 0 
[mem 0x14205000-0x14205fff]: assigned Dec 16 12:11:10.429911 kernel: pci 0000:00:01.5: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:11:10.430013 kernel: pci 0000:00:01.6: BAR 0 [mem 0x14206000-0x14206fff]: assigned Dec 16 12:11:10.430103 kernel: pci 0000:00:01.6: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:11:10.430193 kernel: pci 0000:00:01.7: BAR 0 [mem 0x14207000-0x14207fff]: assigned Dec 16 12:11:10.430300 kernel: pci 0000:00:01.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:11:10.430390 kernel: pci 0000:00:02.0: BAR 0 [mem 0x14208000-0x14208fff]: assigned Dec 16 12:11:10.430477 kernel: pci 0000:00:02.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:11:10.430565 kernel: pci 0000:00:02.1: BAR 0 [mem 0x14209000-0x14209fff]: assigned Dec 16 12:11:10.430661 kernel: pci 0000:00:02.1: bridge window [io 0xa000-0xafff]: assigned Dec 16 12:11:10.430750 kernel: pci 0000:00:02.2: BAR 0 [mem 0x1420a000-0x1420afff]: assigned Dec 16 12:11:10.430835 kernel: pci 0000:00:02.2: bridge window [io 0xb000-0xbfff]: assigned Dec 16 12:11:10.430921 kernel: pci 0000:00:02.3: BAR 0 [mem 0x1420b000-0x1420bfff]: assigned Dec 16 12:11:10.431011 kernel: pci 0000:00:02.3: bridge window [io 0xc000-0xcfff]: assigned Dec 16 12:11:10.431098 kernel: pci 0000:00:02.4: BAR 0 [mem 0x1420c000-0x1420cfff]: assigned Dec 16 12:11:10.431188 kernel: pci 0000:00:02.4: bridge window [io 0xd000-0xdfff]: assigned Dec 16 12:11:10.431275 kernel: pci 0000:00:02.5: BAR 0 [mem 0x1420d000-0x1420dfff]: assigned Dec 16 12:11:10.431361 kernel: pci 0000:00:02.5: bridge window [io 0xe000-0xefff]: assigned Dec 16 12:11:10.431450 kernel: pci 0000:00:02.6: BAR 0 [mem 0x1420e000-0x1420efff]: assigned Dec 16 12:11:10.431540 kernel: pci 0000:00:02.6: bridge window [io 0xf000-0xffff]: assigned Dec 16 12:11:10.431735 kernel: pci 0000:00:02.7: BAR 0 [mem 0x1420f000-0x1420ffff]: assigned Dec 16 12:11:10.431844 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 
12:11:10.431933 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.432023 kernel: pci 0000:00:03.0: BAR 0 [mem 0x14210000-0x14210fff]: assigned Dec 16 12:11:10.432109 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.432204 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.432293 kernel: pci 0000:00:03.1: BAR 0 [mem 0x14211000-0x14211fff]: assigned Dec 16 12:11:10.432379 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.432470 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.432559 kernel: pci 0000:00:03.2: BAR 0 [mem 0x14212000-0x14212fff]: assigned Dec 16 12:11:10.432692 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.432791 kernel: pci 0000:00:03.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.432882 kernel: pci 0000:00:03.3: BAR 0 [mem 0x14213000-0x14213fff]: assigned Dec 16 12:11:10.432969 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.433056 kernel: pci 0000:00:03.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.433144 kernel: pci 0000:00:03.4: BAR 0 [mem 0x14214000-0x14214fff]: assigned Dec 16 12:11:10.433232 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.433340 kernel: pci 0000:00:03.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.433436 kernel: pci 0000:00:03.5: BAR 0 [mem 0x14215000-0x14215fff]: assigned Dec 16 12:11:10.433521 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.433604 kernel: pci 0000:00:03.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.433713 kernel: pci 0000:00:03.6: BAR 0 [mem 0x14216000-0x14216fff]: assigned Dec 16 12:11:10.433802 kernel: pci 
0000:00:03.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.433888 kernel: pci 0000:00:03.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.433980 kernel: pci 0000:00:03.7: BAR 0 [mem 0x14217000-0x14217fff]: assigned Dec 16 12:11:10.434065 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.434149 kernel: pci 0000:00:03.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.434235 kernel: pci 0000:00:04.0: BAR 0 [mem 0x14218000-0x14218fff]: assigned Dec 16 12:11:10.434320 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.434405 kernel: pci 0000:00:04.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.434492 kernel: pci 0000:00:04.1: BAR 0 [mem 0x14219000-0x14219fff]: assigned Dec 16 12:11:10.434582 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.434678 kernel: pci 0000:00:04.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.434781 kernel: pci 0000:00:04.2: BAR 0 [mem 0x1421a000-0x1421afff]: assigned Dec 16 12:11:10.434868 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.434953 kernel: pci 0000:00:04.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.435042 kernel: pci 0000:00:04.3: BAR 0 [mem 0x1421b000-0x1421bfff]: assigned Dec 16 12:11:10.435132 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.435215 kernel: pci 0000:00:04.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.435300 kernel: pci 0000:00:04.4: BAR 0 [mem 0x1421c000-0x1421cfff]: assigned Dec 16 12:11:10.435383 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.435465 kernel: pci 0000:00:04.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.435551 kernel: pci 0000:00:04.5: BAR 0 [mem 
0x1421d000-0x1421dfff]: assigned Dec 16 12:11:10.435668 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.435760 kernel: pci 0000:00:04.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.435850 kernel: pci 0000:00:04.6: BAR 0 [mem 0x1421e000-0x1421efff]: assigned Dec 16 12:11:10.435938 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.436043 kernel: pci 0000:00:04.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.436132 kernel: pci 0000:00:04.7: BAR 0 [mem 0x1421f000-0x1421ffff]: assigned Dec 16 12:11:10.436216 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.436305 kernel: pci 0000:00:04.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.436390 kernel: pci 0000:00:05.0: BAR 0 [mem 0x14220000-0x14220fff]: assigned Dec 16 12:11:10.436475 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.436557 kernel: pci 0000:00:05.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.436655 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:11:10.436744 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:11:10.436833 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:11:10.436918 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:11:10.437002 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:11:10.437088 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:11:10.437173 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:11:10.437258 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:11:10.437341 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:11:10.437429 kernel: pci 0000:00:03.7: 
bridge window [io 0xa000-0xafff]: assigned Dec 16 12:11:10.437514 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff]: assigned Dec 16 12:11:10.437598 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff]: assigned Dec 16 12:11:10.437695 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff]: assigned Dec 16 12:11:10.437786 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff]: assigned Dec 16 12:11:10.437871 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff]: assigned Dec 16 12:11:10.437958 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.438057 kernel: pci 0000:00:03.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.438146 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.438230 kernel: pci 0000:00:03.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.438315 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.438398 kernel: pci 0000:00:02.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.438483 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.438566 kernel: pci 0000:00:02.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.438664 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.438755 kernel: pci 0000:00:02.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.438841 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.438925 kernel: pci 0000:00:02.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.439011 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.439095 kernel: pci 0000:00:02.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.439183 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: 
can't assign; no space Dec 16 12:11:10.439270 kernel: pci 0000:00:02.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.439355 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.439452 kernel: pci 0000:00:02.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.439542 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.439653 kernel: pci 0000:00:02.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.439745 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.439834 kernel: pci 0000:00:01.7: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.439922 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.440006 kernel: pci 0000:00:01.6: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.440092 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.440175 kernel: pci 0000:00:01.5: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.440261 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.440347 kernel: pci 0000:00:01.4: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.440433 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.440517 kernel: pci 0000:00:01.3: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.440603 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.440700 kernel: pci 0000:00:01.2: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.440790 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.440875 kernel: pci 0000:00:01.1: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.440960 kernel: pci 0000:00:01.0: bridge 
window [io size 0x1000]: can't assign; no space Dec 16 12:11:10.441043 kernel: pci 0000:00:01.0: bridge window [io size 0x1000]: failed to assign Dec 16 12:11:10.441135 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 16 12:11:10.441221 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 16 12:11:10.441309 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 16 12:11:10.441393 kernel: pci 0000:00:01.0: PCI bridge to [bus 01] Dec 16 12:11:10.441477 kernel: pci 0000:00:01.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:11:10.441560 kernel: pci 0000:00:01.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:11:10.441660 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 16 12:11:10.441746 kernel: pci 0000:00:01.1: PCI bridge to [bus 02] Dec 16 12:11:10.441833 kernel: pci 0000:00:01.1: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:11:10.441920 kernel: pci 0000:00:01.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:11:10.442011 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref]: assigned Dec 16 12:11:10.442096 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 16 12:11:10.442180 kernel: pci 0000:00:01.2: PCI bridge to [bus 03] Dec 16 12:11:10.442266 kernel: pci 0000:00:01.2: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:11:10.442351 kernel: pci 0000:00:01.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:11:10.442440 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 16 12:11:10.442524 kernel: pci 0000:00:01.3: PCI bridge to [bus 04] Dec 16 12:11:10.442608 kernel: pci 0000:00:01.3: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:11:10.442701 kernel: pci 0000:00:01.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:11:10.442791 kernel: pci 0000:05:00.0: BAR 4 [mem 
0x8000800000-0x8000803fff 64bit pref]: assigned Dec 16 12:11:10.442880 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Dec 16 12:11:10.442964 kernel: pci 0000:00:01.4: PCI bridge to [bus 05] Dec 16 12:11:10.443048 kernel: pci 0000:00:01.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:11:10.443131 kernel: pci 0000:00:01.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:11:10.443221 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 12:11:10.443310 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 12:11:10.443395 kernel: pci 0000:00:01.5: PCI bridge to [bus 06] Dec 16 12:11:10.443478 kernel: pci 0000:00:01.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:11:10.443562 kernel: pci 0000:00:01.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:11:10.443669 kernel: pci 0000:00:01.6: PCI bridge to [bus 07] Dec 16 12:11:10.443756 kernel: pci 0000:00:01.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:11:10.443840 kernel: pci 0000:00:01.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:11:10.443927 kernel: pci 0000:00:01.7: PCI bridge to [bus 08] Dec 16 12:11:10.444010 kernel: pci 0000:00:01.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:11:10.444115 kernel: pci 0000:00:01.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:11:10.444202 kernel: pci 0000:00:02.0: PCI bridge to [bus 09] Dec 16 12:11:10.444286 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:11:10.444369 kernel: pci 0000:00:02.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:11:10.444456 kernel: pci 0000:00:02.1: PCI bridge to [bus 0a] Dec 16 12:11:10.444540 kernel: pci 0000:00:02.1: bridge window [mem 0x11200000-0x113fffff] Dec 16 12:11:10.444623 kernel: pci 0000:00:02.1: bridge window [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:11:10.444726 kernel: pci 
0000:00:02.2: PCI bridge to [bus 0b] Dec 16 12:11:10.444812 kernel: pci 0000:00:02.2: bridge window [mem 0x11400000-0x115fffff] Dec 16 12:11:10.444899 kernel: pci 0000:00:02.2: bridge window [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:11:10.444984 kernel: pci 0000:00:02.3: PCI bridge to [bus 0c] Dec 16 12:11:10.445068 kernel: pci 0000:00:02.3: bridge window [mem 0x11600000-0x117fffff] Dec 16 12:11:10.445176 kernel: pci 0000:00:02.3: bridge window [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:11:10.445265 kernel: pci 0000:00:02.4: PCI bridge to [bus 0d] Dec 16 12:11:10.445349 kernel: pci 0000:00:02.4: bridge window [mem 0x11800000-0x119fffff] Dec 16 12:11:10.445436 kernel: pci 0000:00:02.4: bridge window [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:11:10.445522 kernel: pci 0000:00:02.5: PCI bridge to [bus 0e] Dec 16 12:11:10.445606 kernel: pci 0000:00:02.5: bridge window [mem 0x11a00000-0x11bfffff] Dec 16 12:11:10.445701 kernel: pci 0000:00:02.5: bridge window [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:11:10.445788 kernel: pci 0000:00:02.6: PCI bridge to [bus 0f] Dec 16 12:11:10.445874 kernel: pci 0000:00:02.6: bridge window [mem 0x11c00000-0x11dfffff] Dec 16 12:11:10.445958 kernel: pci 0000:00:02.6: bridge window [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:11:10.446043 kernel: pci 0000:00:02.7: PCI bridge to [bus 10] Dec 16 12:11:10.446128 kernel: pci 0000:00:02.7: bridge window [mem 0x11e00000-0x11ffffff] Dec 16 12:11:10.446211 kernel: pci 0000:00:02.7: bridge window [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:11:10.446297 kernel: pci 0000:00:03.0: PCI bridge to [bus 11] Dec 16 12:11:10.446381 kernel: pci 0000:00:03.0: bridge window [mem 0x12000000-0x121fffff] Dec 16 12:11:10.446464 kernel: pci 0000:00:03.0: bridge window [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:11:10.446550 kernel: pci 0000:00:03.1: PCI bridge to [bus 12] Dec 16 12:11:10.446643 kernel: pci 0000:00:03.1: bridge window [mem 
0x12200000-0x123fffff] Dec 16 12:11:10.446730 kernel: pci 0000:00:03.1: bridge window [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:11:10.446819 kernel: pci 0000:00:03.2: PCI bridge to [bus 13] Dec 16 12:11:10.446904 kernel: pci 0000:00:03.2: bridge window [io 0xf000-0xffff] Dec 16 12:11:10.446989 kernel: pci 0000:00:03.2: bridge window [mem 0x12400000-0x125fffff] Dec 16 12:11:10.447075 kernel: pci 0000:00:03.2: bridge window [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:11:10.447163 kernel: pci 0000:00:03.3: PCI bridge to [bus 14] Dec 16 12:11:10.447246 kernel: pci 0000:00:03.3: bridge window [io 0xe000-0xefff] Dec 16 12:11:10.447332 kernel: pci 0000:00:03.3: bridge window [mem 0x12600000-0x127fffff] Dec 16 12:11:10.447415 kernel: pci 0000:00:03.3: bridge window [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:11:10.447500 kernel: pci 0000:00:03.4: PCI bridge to [bus 15] Dec 16 12:11:10.447584 kernel: pci 0000:00:03.4: bridge window [io 0xd000-0xdfff] Dec 16 12:11:10.447696 kernel: pci 0000:00:03.4: bridge window [mem 0x12800000-0x129fffff] Dec 16 12:11:10.447783 kernel: pci 0000:00:03.4: bridge window [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:11:10.447870 kernel: pci 0000:00:03.5: PCI bridge to [bus 16] Dec 16 12:11:10.447958 kernel: pci 0000:00:03.5: bridge window [io 0xc000-0xcfff] Dec 16 12:11:10.448043 kernel: pci 0000:00:03.5: bridge window [mem 0x12a00000-0x12bfffff] Dec 16 12:11:10.448129 kernel: pci 0000:00:03.5: bridge window [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:11:10.448217 kernel: pci 0000:00:03.6: PCI bridge to [bus 17] Dec 16 12:11:10.448302 kernel: pci 0000:00:03.6: bridge window [io 0xb000-0xbfff] Dec 16 12:11:10.448387 kernel: pci 0000:00:03.6: bridge window [mem 0x12c00000-0x12dfffff] Dec 16 12:11:10.448474 kernel: pci 0000:00:03.6: bridge window [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:11:10.448566 kernel: pci 0000:00:03.7: PCI bridge to [bus 18] Dec 16 12:11:10.448662 kernel: pci 
0000:00:03.7: bridge window [io 0xa000-0xafff] Dec 16 12:11:10.448784 kernel: pci 0000:00:03.7: bridge window [mem 0x12e00000-0x12ffffff] Dec 16 12:11:10.448871 kernel: pci 0000:00:03.7: bridge window [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:11:10.448957 kernel: pci 0000:00:04.0: PCI bridge to [bus 19] Dec 16 12:11:10.449041 kernel: pci 0000:00:04.0: bridge window [io 0x9000-0x9fff] Dec 16 12:11:10.449127 kernel: pci 0000:00:04.0: bridge window [mem 0x13000000-0x131fffff] Dec 16 12:11:10.449210 kernel: pci 0000:00:04.0: bridge window [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:11:10.449295 kernel: pci 0000:00:04.1: PCI bridge to [bus 1a] Dec 16 12:11:10.449379 kernel: pci 0000:00:04.1: bridge window [io 0x8000-0x8fff] Dec 16 12:11:10.449462 kernel: pci 0000:00:04.1: bridge window [mem 0x13200000-0x133fffff] Dec 16 12:11:10.449544 kernel: pci 0000:00:04.1: bridge window [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:11:10.449644 kernel: pci 0000:00:04.2: PCI bridge to [bus 1b] Dec 16 12:11:10.449741 kernel: pci 0000:00:04.2: bridge window [io 0x7000-0x7fff] Dec 16 12:11:10.449829 kernel: pci 0000:00:04.2: bridge window [mem 0x13400000-0x135fffff] Dec 16 12:11:10.449916 kernel: pci 0000:00:04.2: bridge window [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:11:10.450004 kernel: pci 0000:00:04.3: PCI bridge to [bus 1c] Dec 16 12:11:10.450087 kernel: pci 0000:00:04.3: bridge window [io 0x6000-0x6fff] Dec 16 12:11:10.450170 kernel: pci 0000:00:04.3: bridge window [mem 0x13600000-0x137fffff] Dec 16 12:11:10.450253 kernel: pci 0000:00:04.3: bridge window [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:11:10.450362 kernel: pci 0000:00:04.4: PCI bridge to [bus 1d] Dec 16 12:11:10.450470 kernel: pci 0000:00:04.4: bridge window [io 0x5000-0x5fff] Dec 16 12:11:10.450556 kernel: pci 0000:00:04.4: bridge window [mem 0x13800000-0x139fffff] Dec 16 12:11:10.450654 kernel: pci 0000:00:04.4: bridge window [mem 0x8003800000-0x80039fffff 64bit pref] 
Dec 16 12:11:10.450743 kernel: pci 0000:00:04.5: PCI bridge to [bus 1e] Dec 16 12:11:10.450827 kernel: pci 0000:00:04.5: bridge window [io 0x4000-0x4fff] Dec 16 12:11:10.450914 kernel: pci 0000:00:04.5: bridge window [mem 0x13a00000-0x13bfffff] Dec 16 12:11:10.451006 kernel: pci 0000:00:04.5: bridge window [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:11:10.451096 kernel: pci 0000:00:04.6: PCI bridge to [bus 1f] Dec 16 12:11:10.451181 kernel: pci 0000:00:04.6: bridge window [io 0x3000-0x3fff] Dec 16 12:11:10.451267 kernel: pci 0000:00:04.6: bridge window [mem 0x13c00000-0x13dfffff] Dec 16 12:11:10.451350 kernel: pci 0000:00:04.6: bridge window [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:11:10.451437 kernel: pci 0000:00:04.7: PCI bridge to [bus 20] Dec 16 12:11:10.451525 kernel: pci 0000:00:04.7: bridge window [io 0x2000-0x2fff] Dec 16 12:11:10.451623 kernel: pci 0000:00:04.7: bridge window [mem 0x13e00000-0x13ffffff] Dec 16 12:11:10.451740 kernel: pci 0000:00:04.7: bridge window [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:11:10.451830 kernel: pci 0000:00:05.0: PCI bridge to [bus 21] Dec 16 12:11:10.451916 kernel: pci 0000:00:05.0: bridge window [io 0x1000-0x1fff] Dec 16 12:11:10.452001 kernel: pci 0000:00:05.0: bridge window [mem 0x14000000-0x141fffff] Dec 16 12:11:10.452085 kernel: pci 0000:00:05.0: bridge window [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:11:10.452174 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 12:11:10.452252 kernel: pci_bus 0000:00: resource 5 [io 0x0000-0xffff window] Dec 16 12:11:10.452330 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 12:11:10.452423 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 12:11:10.452506 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:11:10.452601 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 12:11:10.452700 kernel: pci_bus 
0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:11:10.452789 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 12:11:10.452867 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:11:10.452951 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 12:11:10.453029 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:11:10.453115 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 12:11:10.453193 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:11:10.453277 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 12:11:10.453355 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:11:10.453442 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 12:11:10.453527 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:11:10.453641 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 12:11:10.453739 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:11:10.453825 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 12:11:10.453903 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:11:10.453990 kernel: pci_bus 0000:0a: resource 1 [mem 0x11200000-0x113fffff] Dec 16 12:11:10.454068 kernel: pci_bus 0000:0a: resource 2 [mem 0x8001200000-0x80013fffff 64bit pref] Dec 16 12:11:10.454159 kernel: pci_bus 0000:0b: resource 1 [mem 0x11400000-0x115fffff] Dec 16 12:11:10.454237 kernel: pci_bus 0000:0b: resource 2 [mem 0x8001400000-0x80015fffff 64bit pref] Dec 16 12:11:10.454321 kernel: pci_bus 0000:0c: resource 1 [mem 0x11600000-0x117fffff] Dec 16 12:11:10.454403 kernel: pci_bus 0000:0c: resource 2 [mem 0x8001600000-0x80017fffff 64bit pref] Dec 16 12:11:10.454495 kernel: pci_bus 
0000:0d: resource 1 [mem 0x11800000-0x119fffff] Dec 16 12:11:10.454577 kernel: pci_bus 0000:0d: resource 2 [mem 0x8001800000-0x80019fffff 64bit pref] Dec 16 12:11:10.454681 kernel: pci_bus 0000:0e: resource 1 [mem 0x11a00000-0x11bfffff] Dec 16 12:11:10.454763 kernel: pci_bus 0000:0e: resource 2 [mem 0x8001a00000-0x8001bfffff 64bit pref] Dec 16 12:11:10.454854 kernel: pci_bus 0000:0f: resource 1 [mem 0x11c00000-0x11dfffff] Dec 16 12:11:10.454935 kernel: pci_bus 0000:0f: resource 2 [mem 0x8001c00000-0x8001dfffff 64bit pref] Dec 16 12:11:10.455019 kernel: pci_bus 0000:10: resource 1 [mem 0x11e00000-0x11ffffff] Dec 16 12:11:10.455115 kernel: pci_bus 0000:10: resource 2 [mem 0x8001e00000-0x8001ffffff 64bit pref] Dec 16 12:11:10.455200 kernel: pci_bus 0000:11: resource 1 [mem 0x12000000-0x121fffff] Dec 16 12:11:10.455278 kernel: pci_bus 0000:11: resource 2 [mem 0x8002000000-0x80021fffff 64bit pref] Dec 16 12:11:10.455366 kernel: pci_bus 0000:12: resource 1 [mem 0x12200000-0x123fffff] Dec 16 12:11:10.455444 kernel: pci_bus 0000:12: resource 2 [mem 0x8002200000-0x80023fffff 64bit pref] Dec 16 12:11:10.455533 kernel: pci_bus 0000:13: resource 0 [io 0xf000-0xffff] Dec 16 12:11:10.455635 kernel: pci_bus 0000:13: resource 1 [mem 0x12400000-0x125fffff] Dec 16 12:11:10.455728 kernel: pci_bus 0000:13: resource 2 [mem 0x8002400000-0x80025fffff 64bit pref] Dec 16 12:11:10.455815 kernel: pci_bus 0000:14: resource 0 [io 0xe000-0xefff] Dec 16 12:11:10.455893 kernel: pci_bus 0000:14: resource 1 [mem 0x12600000-0x127fffff] Dec 16 12:11:10.455971 kernel: pci_bus 0000:14: resource 2 [mem 0x8002600000-0x80027fffff 64bit pref] Dec 16 12:11:10.456054 kernel: pci_bus 0000:15: resource 0 [io 0xd000-0xdfff] Dec 16 12:11:10.456133 kernel: pci_bus 0000:15: resource 1 [mem 0x12800000-0x129fffff] Dec 16 12:11:10.456212 kernel: pci_bus 0000:15: resource 2 [mem 0x8002800000-0x80029fffff 64bit pref] Dec 16 12:11:10.456299 kernel: pci_bus 0000:16: resource 0 [io 0xc000-0xcfff] Dec 16 12:11:10.456377 
kernel: pci_bus 0000:16: resource 1 [mem 0x12a00000-0x12bfffff] Dec 16 12:11:10.456454 kernel: pci_bus 0000:16: resource 2 [mem 0x8002a00000-0x8002bfffff 64bit pref] Dec 16 12:11:10.456538 kernel: pci_bus 0000:17: resource 0 [io 0xb000-0xbfff] Dec 16 12:11:10.456620 kernel: pci_bus 0000:17: resource 1 [mem 0x12c00000-0x12dfffff] Dec 16 12:11:10.456715 kernel: pci_bus 0000:17: resource 2 [mem 0x8002c00000-0x8002dfffff 64bit pref] Dec 16 12:11:10.456800 kernel: pci_bus 0000:18: resource 0 [io 0xa000-0xafff] Dec 16 12:11:10.456879 kernel: pci_bus 0000:18: resource 1 [mem 0x12e00000-0x12ffffff] Dec 16 12:11:10.456956 kernel: pci_bus 0000:18: resource 2 [mem 0x8002e00000-0x8002ffffff 64bit pref] Dec 16 12:11:10.457039 kernel: pci_bus 0000:19: resource 0 [io 0x9000-0x9fff] Dec 16 12:11:10.457122 kernel: pci_bus 0000:19: resource 1 [mem 0x13000000-0x131fffff] Dec 16 12:11:10.457199 kernel: pci_bus 0000:19: resource 2 [mem 0x8003000000-0x80031fffff 64bit pref] Dec 16 12:11:10.457284 kernel: pci_bus 0000:1a: resource 0 [io 0x8000-0x8fff] Dec 16 12:11:10.457362 kernel: pci_bus 0000:1a: resource 1 [mem 0x13200000-0x133fffff] Dec 16 12:11:10.457439 kernel: pci_bus 0000:1a: resource 2 [mem 0x8003200000-0x80033fffff 64bit pref] Dec 16 12:11:10.457525 kernel: pci_bus 0000:1b: resource 0 [io 0x7000-0x7fff] Dec 16 12:11:10.457606 kernel: pci_bus 0000:1b: resource 1 [mem 0x13400000-0x135fffff] Dec 16 12:11:10.457696 kernel: pci_bus 0000:1b: resource 2 [mem 0x8003400000-0x80035fffff 64bit pref] Dec 16 12:11:10.457787 kernel: pci_bus 0000:1c: resource 0 [io 0x6000-0x6fff] Dec 16 12:11:10.457867 kernel: pci_bus 0000:1c: resource 1 [mem 0x13600000-0x137fffff] Dec 16 12:11:10.457944 kernel: pci_bus 0000:1c: resource 2 [mem 0x8003600000-0x80037fffff 64bit pref] Dec 16 12:11:10.458033 kernel: pci_bus 0000:1d: resource 0 [io 0x5000-0x5fff] Dec 16 12:11:10.458111 kernel: pci_bus 0000:1d: resource 1 [mem 0x13800000-0x139fffff] Dec 16 12:11:10.458188 kernel: pci_bus 0000:1d: resource 2 [mem 
0x8003800000-0x80039fffff 64bit pref] Dec 16 12:11:10.458272 kernel: pci_bus 0000:1e: resource 0 [io 0x4000-0x4fff] Dec 16 12:11:10.458350 kernel: pci_bus 0000:1e: resource 1 [mem 0x13a00000-0x13bfffff] Dec 16 12:11:10.458428 kernel: pci_bus 0000:1e: resource 2 [mem 0x8003a00000-0x8003bfffff 64bit pref] Dec 16 12:11:10.458515 kernel: pci_bus 0000:1f: resource 0 [io 0x3000-0x3fff] Dec 16 12:11:10.458594 kernel: pci_bus 0000:1f: resource 1 [mem 0x13c00000-0x13dfffff] Dec 16 12:11:10.458696 kernel: pci_bus 0000:1f: resource 2 [mem 0x8003c00000-0x8003dfffff 64bit pref] Dec 16 12:11:10.458783 kernel: pci_bus 0000:20: resource 0 [io 0x2000-0x2fff] Dec 16 12:11:10.458861 kernel: pci_bus 0000:20: resource 1 [mem 0x13e00000-0x13ffffff] Dec 16 12:11:10.458939 kernel: pci_bus 0000:20: resource 2 [mem 0x8003e00000-0x8003ffffff 64bit pref] Dec 16 12:11:10.459026 kernel: pci_bus 0000:21: resource 0 [io 0x1000-0x1fff] Dec 16 12:11:10.459104 kernel: pci_bus 0000:21: resource 1 [mem 0x14000000-0x141fffff] Dec 16 12:11:10.459182 kernel: pci_bus 0000:21: resource 2 [mem 0x8004000000-0x80041fffff 64bit pref] Dec 16 12:11:10.459192 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 12:11:10.459201 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 12:11:10.459210 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 12:11:10.459220 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 12:11:10.459229 kernel: iommu: Default domain type: Translated Dec 16 12:11:10.459238 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 12:11:10.459247 kernel: efivars: Registered efivars operations Dec 16 12:11:10.459255 kernel: vgaarb: loaded Dec 16 12:11:10.459263 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 12:11:10.459272 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:11:10.459282 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:11:10.459290 kernel: pnp: PnP ACPI 
init Dec 16 12:11:10.459385 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 12:11:10.459398 kernel: pnp: PnP ACPI: found 1 devices Dec 16 12:11:10.459406 kernel: NET: Registered PF_INET protocol family Dec 16 12:11:10.459415 kernel: IP idents hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:11:10.459424 kernel: tcp_listen_portaddr_hash hash table entries: 8192 (order: 5, 131072 bytes, linear) Dec 16 12:11:10.459435 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:11:10.459443 kernel: TCP established hash table entries: 131072 (order: 8, 1048576 bytes, linear) Dec 16 12:11:10.459452 kernel: TCP bind hash table entries: 65536 (order: 9, 2097152 bytes, linear) Dec 16 12:11:10.459461 kernel: TCP: Hash tables configured (established 131072 bind 65536) Dec 16 12:11:10.459470 kernel: UDP hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:11:10.459479 kernel: UDP-Lite hash table entries: 8192 (order: 6, 262144 bytes, linear) Dec 16 12:11:10.459488 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:11:10.459583 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 12:11:10.459596 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:11:10.459616 kernel: kvm [1]: HYP mode not available Dec 16 12:11:10.459638 kernel: Initialise system trusted keyrings Dec 16 12:11:10.459647 kernel: workingset: timestamp_bits=39 max_order=22 bucket_order=0 Dec 16 12:11:10.459656 kernel: Key type asymmetric registered Dec 16 12:11:10.459665 kernel: Asymmetric key parser 'x509' registered Dec 16 12:11:10.459677 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 12:11:10.459686 kernel: io scheduler mq-deadline registered Dec 16 12:11:10.459695 kernel: io scheduler kyber registered Dec 16 12:11:10.459703 kernel: io scheduler bfq registered Dec 16 12:11:10.459713 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 
12:11:10.459811 kernel: pcieport 0000:00:01.0: PME: Signaling with IRQ 50 Dec 16 12:11:10.459900 kernel: pcieport 0000:00:01.0: AER: enabled with IRQ 50 Dec 16 12:11:10.459988 kernel: pcieport 0000:00:01.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.460074 kernel: pcieport 0000:00:01.1: PME: Signaling with IRQ 51 Dec 16 12:11:10.460159 kernel: pcieport 0000:00:01.1: AER: enabled with IRQ 51 Dec 16 12:11:10.460244 kernel: pcieport 0000:00:01.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.460331 kernel: pcieport 0000:00:01.2: PME: Signaling with IRQ 52 Dec 16 12:11:10.460414 kernel: pcieport 0000:00:01.2: AER: enabled with IRQ 52 Dec 16 12:11:10.460500 kernel: pcieport 0000:00:01.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.460597 kernel: pcieport 0000:00:01.3: PME: Signaling with IRQ 53 Dec 16 12:11:10.460707 kernel: pcieport 0000:00:01.3: AER: enabled with IRQ 53 Dec 16 12:11:10.460794 kernel: pcieport 0000:00:01.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.460883 kernel: pcieport 0000:00:01.4: PME: Signaling with IRQ 54 Dec 16 12:11:10.460967 kernel: pcieport 0000:00:01.4: AER: enabled with IRQ 54 Dec 16 12:11:10.461055 kernel: pcieport 0000:00:01.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.461148 kernel: pcieport 0000:00:01.5: PME: Signaling with IRQ 55 Dec 16 12:11:10.461236 kernel: pcieport 0000:00:01.5: AER: enabled with IRQ 55 Dec 16 12:11:10.461320 kernel: pcieport 0000:00:01.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.461414 
kernel: pcieport 0000:00:01.6: PME: Signaling with IRQ 56 Dec 16 12:11:10.461504 kernel: pcieport 0000:00:01.6: AER: enabled with IRQ 56 Dec 16 12:11:10.461589 kernel: pcieport 0000:00:01.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.461693 kernel: pcieport 0000:00:01.7: PME: Signaling with IRQ 57 Dec 16 12:11:10.461781 kernel: pcieport 0000:00:01.7: AER: enabled with IRQ 57 Dec 16 12:11:10.461865 kernel: pcieport 0000:00:01.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.461877 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 12:11:10.461959 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 58 Dec 16 12:11:10.462044 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 58 Dec 16 12:11:10.462131 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.462218 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 59 Dec 16 12:11:10.462303 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 59 Dec 16 12:11:10.462396 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.462487 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 60 Dec 16 12:11:10.462582 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 60 Dec 16 12:11:10.462698 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.462790 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 61 Dec 16 12:11:10.462881 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 61 Dec 16 12:11:10.462965 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ 
NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.463052 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 62 Dec 16 12:11:10.463147 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 62 Dec 16 12:11:10.463233 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.463324 kernel: pcieport 0000:00:02.5: PME: Signaling with IRQ 63 Dec 16 12:11:10.463411 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 63 Dec 16 12:11:10.463500 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.463586 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 64 Dec 16 12:11:10.463698 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 64 Dec 16 12:11:10.463785 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.463876 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 65 Dec 16 12:11:10.463964 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 65 Dec 16 12:11:10.464052 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.464064 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 12:11:10.464147 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 66 Dec 16 12:11:10.464232 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 66 Dec 16 12:11:10.464327 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.464417 kernel: pcieport 0000:00:03.1: PME: Signaling with IRQ 67 Dec 16 12:11:10.464507 kernel: pcieport 0000:00:03.1: AER: enabled with IRQ 67 Dec 16 12:11:10.464592 kernel: pcieport 0000:00:03.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ 
MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.464694 kernel: pcieport 0000:00:03.2: PME: Signaling with IRQ 68 Dec 16 12:11:10.464783 kernel: pcieport 0000:00:03.2: AER: enabled with IRQ 68 Dec 16 12:11:10.464867 kernel: pcieport 0000:00:03.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.464961 kernel: pcieport 0000:00:03.3: PME: Signaling with IRQ 69 Dec 16 12:11:10.465052 kernel: pcieport 0000:00:03.3: AER: enabled with IRQ 69 Dec 16 12:11:10.465136 kernel: pcieport 0000:00:03.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.465221 kernel: pcieport 0000:00:03.4: PME: Signaling with IRQ 70 Dec 16 12:11:10.465307 kernel: pcieport 0000:00:03.4: AER: enabled with IRQ 70 Dec 16 12:11:10.465393 kernel: pcieport 0000:00:03.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.465483 kernel: pcieport 0000:00:03.5: PME: Signaling with IRQ 71 Dec 16 12:11:10.465569 kernel: pcieport 0000:00:03.5: AER: enabled with IRQ 71 Dec 16 12:11:10.465675 kernel: pcieport 0000:00:03.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.465766 kernel: pcieport 0000:00:03.6: PME: Signaling with IRQ 72 Dec 16 12:11:10.465852 kernel: pcieport 0000:00:03.6: AER: enabled with IRQ 72 Dec 16 12:11:10.465937 kernel: pcieport 0000:00:03.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.466028 kernel: pcieport 0000:00:03.7: PME: Signaling with IRQ 73 Dec 16 12:11:10.466115 kernel: pcieport 0000:00:03.7: AER: enabled with IRQ 73 Dec 16 12:11:10.466199 kernel: pcieport 0000:00:03.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ 
PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.466211 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 12:11:10.466297 kernel: pcieport 0000:00:04.0: PME: Signaling with IRQ 74 Dec 16 12:11:10.466382 kernel: pcieport 0000:00:04.0: AER: enabled with IRQ 74 Dec 16 12:11:10.466467 kernel: pcieport 0000:00:04.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.466555 kernel: pcieport 0000:00:04.1: PME: Signaling with IRQ 75 Dec 16 12:11:10.466655 kernel: pcieport 0000:00:04.1: AER: enabled with IRQ 75 Dec 16 12:11:10.466742 kernel: pcieport 0000:00:04.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.466831 kernel: pcieport 0000:00:04.2: PME: Signaling with IRQ 76 Dec 16 12:11:10.466916 kernel: pcieport 0000:00:04.2: AER: enabled with IRQ 76 Dec 16 12:11:10.467000 kernel: pcieport 0000:00:04.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.467090 kernel: pcieport 0000:00:04.3: PME: Signaling with IRQ 77 Dec 16 12:11:10.467175 kernel: pcieport 0000:00:04.3: AER: enabled with IRQ 77 Dec 16 12:11:10.467261 kernel: pcieport 0000:00:04.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.467349 kernel: pcieport 0000:00:04.4: PME: Signaling with IRQ 78 Dec 16 12:11:10.467434 kernel: pcieport 0000:00:04.4: AER: enabled with IRQ 78 Dec 16 12:11:10.467518 kernel: pcieport 0000:00:04.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.467621 kernel: pcieport 0000:00:04.5: PME: Signaling with IRQ 79 Dec 16 12:11:10.467736 kernel: pcieport 0000:00:04.5: AER: enabled with IRQ 79 Dec 16 12:11:10.467822 kernel: pcieport 
0000:00:04.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.467909 kernel: pcieport 0000:00:04.6: PME: Signaling with IRQ 80 Dec 16 12:11:10.467995 kernel: pcieport 0000:00:04.6: AER: enabled with IRQ 80 Dec 16 12:11:10.468079 kernel: pcieport 0000:00:04.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.468166 kernel: pcieport 0000:00:04.7: PME: Signaling with IRQ 81 Dec 16 12:11:10.468256 kernel: pcieport 0000:00:04.7: AER: enabled with IRQ 81 Dec 16 12:11:10.468342 kernel: pcieport 0000:00:04.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.468431 kernel: pcieport 0000:00:05.0: PME: Signaling with IRQ 82 Dec 16 12:11:10.468516 kernel: pcieport 0000:00:05.0: AER: enabled with IRQ 82 Dec 16 12:11:10.468600 kernel: pcieport 0000:00:05.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:11:10.468612 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 12:11:10.468623 kernel: ACPI: button: Power Button [PWRB] Dec 16 12:11:10.468729 kernel: virtio-pci 0000:01:00.0: enabling device (0000 -> 0002) Dec 16 12:11:10.468823 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 12:11:10.468835 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:11:10.468843 kernel: thunder_xcv, ver 1.0 Dec 16 12:11:10.468852 kernel: thunder_bgx, ver 1.0 Dec 16 12:11:10.468860 kernel: nicpf, ver 1.0 Dec 16 12:11:10.468872 kernel: nicvf, ver 1.0 Dec 16 12:11:10.468977 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 12:11:10.469059 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T12:11:09 UTC (1765887069) Dec 16 12:11:10.469070 kernel: hid: raw HID events driver (C) Jiri 
Kosina Dec 16 12:11:10.469079 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 12:11:10.469088 kernel: watchdog: NMI not fully supported Dec 16 12:11:10.469098 kernel: watchdog: Hard watchdog permanently disabled Dec 16 12:11:10.469106 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:11:10.469115 kernel: Segment Routing with IPv6 Dec 16 12:11:10.469123 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 12:11:10.469132 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:11:10.469140 kernel: Key type dns_resolver registered Dec 16 12:11:10.469148 kernel: registered taskstats version 1 Dec 16 12:11:10.469157 kernel: Loading compiled-in X.509 certificates Dec 16 12:11:10.469167 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: 545838337a91b65b763486e536766b3eec3ef99d' Dec 16 12:11:10.469175 kernel: Demotion targets for Node 0: null Dec 16 12:11:10.469184 kernel: Key type .fscrypt registered Dec 16 12:11:10.469193 kernel: Key type fscrypt-provisioning registered Dec 16 12:11:10.469201 kernel: ima: No TPM chip found, activating TPM-bypass! 
Dec 16 12:11:10.469210 kernel: ima: Allocated hash algorithm: sha1
Dec 16 12:11:10.469218 kernel: ima: No architecture policies found
Dec 16 12:11:10.469228 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng)
Dec 16 12:11:10.469237 kernel: clk: Disabling unused clocks
Dec 16 12:11:10.469246 kernel: PM: genpd: Disabling unused power domains
Dec 16 12:11:10.469254 kernel: Freeing unused kernel memory: 12480K
Dec 16 12:11:10.469262 kernel: Run /init as init process
Dec 16 12:11:10.469271 kernel: with arguments:
Dec 16 12:11:10.469279 kernel: /init
Dec 16 12:11:10.469289 kernel: with environment:
Dec 16 12:11:10.469297 kernel: HOME=/
Dec 16 12:11:10.469306 kernel: TERM=linux
Dec 16 12:11:10.469315 kernel: ACPI: bus type USB registered
Dec 16 12:11:10.469323 kernel: usbcore: registered new interface driver usbfs
Dec 16 12:11:10.469332 kernel: usbcore: registered new interface driver hub
Dec 16 12:11:10.469340 kernel: usbcore: registered new device driver usb
Dec 16 12:11:10.469435 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 12:11:10.469525 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1
Dec 16 12:11:10.469612 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010
Dec 16 12:11:10.469725 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller
Dec 16 12:11:10.469813 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2
Dec 16 12:11:10.469902 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed
Dec 16 12:11:10.470013 kernel: hub 1-0:1.0: USB hub found
Dec 16 12:11:10.470119 kernel: hub 1-0:1.0: 4 ports detected
Dec 16 12:11:10.470230 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM.
Dec 16 12:11:10.470333 kernel: hub 2-0:1.0: USB hub found
Dec 16 12:11:10.470429 kernel: hub 2-0:1.0: 4 ports detected
Dec 16 12:11:10.470530 kernel: virtio_blk virtio1: 4/0/0 default/read/poll queues
Dec 16 12:11:10.470620 kernel: virtio_blk virtio1: [vda] 104857600 512-byte logical blocks (53.7 GB/50.0 GiB)
Dec 16 12:11:10.470645 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk.
Dec 16 12:11:10.470655 kernel: GPT:25804799 != 104857599
Dec 16 12:11:10.470664 kernel: GPT:Alternate GPT header not at the end of the disk.
Dec 16 12:11:10.470673 kernel: GPT:25804799 != 104857599
Dec 16 12:11:10.470681 kernel: GPT: Use GNU Parted to correct GPT errors.
Dec 16 12:11:10.470692 kernel: vda: vda1 vda2 vda3 vda4 vda6 vda7 vda9
Dec 16 12:11:10.470701 kernel: SCSI subsystem initialized
Dec 16 12:11:10.470710 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log.
Dec 16 12:11:10.470719 kernel: device-mapper: uevent: version 1.0.3
Dec 16 12:11:10.470729 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev
Dec 16 12:11:10.470738 kernel: device-mapper: verity: sha256 using shash "sha256-ce"
Dec 16 12:11:10.470749 kernel: raid6: neonx8 gen() 15832 MB/s
Dec 16 12:11:10.470759 kernel: raid6: neonx4 gen() 15754 MB/s
Dec 16 12:11:10.470768 kernel: raid6: neonx2 gen() 13261 MB/s
Dec 16 12:11:10.470776 kernel: raid6: neonx1 gen() 10400 MB/s
Dec 16 12:11:10.470785 kernel: raid6: int64x8 gen() 6834 MB/s
Dec 16 12:11:10.470794 kernel: raid6: int64x4 gen() 7365 MB/s
Dec 16 12:11:10.470803 kernel: raid6: int64x2 gen() 6118 MB/s
Dec 16 12:11:10.470811 kernel: raid6: int64x1 gen() 5071 MB/s
Dec 16 12:11:10.470822 kernel: raid6: using algorithm neonx8 gen() 15832 MB/s
Dec 16 12:11:10.470831 kernel: raid6: .... xor() 12090 MB/s, rmw enabled
Dec 16 12:11:10.470840 kernel: raid6: using neon recovery algorithm
Dec 16 12:11:10.470849 kernel: xor: measuring software checksum speed
Dec 16 12:11:10.470860 kernel: 8regs : 21624 MB/sec
Dec 16 12:11:10.470869 kernel: 32regs : 21699 MB/sec
Dec 16 12:11:10.470879 kernel: arm64_neon : 26424 MB/sec
Dec 16 12:11:10.470888 kernel: xor: using function: arm64_neon (26424 MB/sec)
Dec 16 12:11:10.471006 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd
Dec 16 12:11:10.471021 kernel: Btrfs loaded, zoned=no, fsverity=no
Dec 16 12:11:10.471030 kernel: BTRFS: device fsid d00a2bc5-1c68-4957-aa37-d070193fcf05 devid 1 transid 36 /dev/mapper/usr (253:0) scanned by mount (276)
Dec 16 12:11:10.471040 kernel: BTRFS info (device dm-0): first mount of filesystem d00a2bc5-1c68-4957-aa37-d070193fcf05
Dec 16 12:11:10.471049 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm
Dec 16 12:11:10.471061 kernel: BTRFS info (device dm-0): disabling log replay at mount time
Dec 16 12:11:10.471070 kernel: BTRFS info (device dm-0): enabling free space tree
Dec 16 12:11:10.471079 kernel: loop: module loaded
Dec 16 12:11:10.471088 kernel: loop0: detected capacity change from 0 to 91832
Dec 16 12:11:10.471097 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher
Dec 16 12:11:10.471205 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd
Dec 16 12:11:10.471221 systemd[1]: Successfully made /usr/ read-only.
Dec 16 12:11:10.471233 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:11:10.471243 systemd[1]: Detected virtualization kvm.
Dec 16 12:11:10.471252 systemd[1]: Detected architecture arm64.
Dec 16 12:11:10.471261 systemd[1]: Running in initrd.
Dec 16 12:11:10.471270 systemd[1]: No hostname configured, using default hostname.
Dec 16 12:11:10.471281 systemd[1]: Hostname set to .
Dec 16 12:11:10.471290 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:11:10.471299 systemd[1]: Queued start job for default target initrd.target.
Dec 16 12:11:10.471308 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Dec 16 12:11:10.471317 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:11:10.471327 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:11:10.471338 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM...
Dec 16 12:11:10.471348 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:11:10.471359 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT...
Dec 16 12:11:10.471369 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A...
Dec 16 12:11:10.471379 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:11:10.471388 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:11:10.471399 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 12:11:10.471408 systemd[1]: Reached target paths.target - Path Units.
Dec 16 12:11:10.471418 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:11:10.471427 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:11:10.471436 systemd[1]: Reached target timers.target - Timer Units.
Dec 16 12:11:10.471446 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 12:11:10.471455 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 12:11:10.471467 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:11:10.471476 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log).
Dec 16 12:11:10.471486 systemd[1]: Listening on systemd-journald.socket - Journal Sockets.
Dec 16 12:11:10.471495 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:11:10.471505 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:11:10.471514 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:11:10.471525 systemd[1]: Reached target sockets.target - Socket Units.
Dec 16 12:11:10.471535 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met.
Dec 16 12:11:10.471544 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup...
Dec 16 12:11:10.471553 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:11:10.471563 systemd[1]: Finished network-cleanup.service - Network Cleanup.
Dec 16 12:11:10.471572 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply).
Dec 16 12:11:10.471581 systemd[1]: Starting systemd-fsck-usr.service...
Dec 16 12:11:10.471592 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:11:10.471601 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:11:10.471625 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:11:10.471658 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup.
Dec 16 12:11:10.471671 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:11:10.471681 systemd[1]: Finished systemd-fsck-usr.service.
Dec 16 12:11:10.471690 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 12:11:10.471726 systemd-journald[416]: Collecting audit messages is enabled.
Dec 16 12:11:10.471751 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this.
Dec 16 12:11:10.471760 kernel: Bridge firewalling registered
Dec 16 12:11:10.471769 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:11:10.471779 kernel: audit: type=1130 audit(1765887070.403:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.471789 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:11:10.471801 kernel: audit: type=1130 audit(1765887070.408:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.471810 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters...
Dec 16 12:11:10.471820 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:11:10.471830 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:11:10.471839 kernel: audit: type=1130 audit(1765887070.420:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.471849 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:11:10.471860 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:11:10.471870 kernel: audit: type=1130 audit(1765887070.435:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.471879 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:11:10.471891 kernel: audit: type=1130 audit(1765887070.441:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.471900 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:11:10.471910 kernel: audit: type=1130 audit(1765887070.445:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.471921 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook...
Dec 16 12:11:10.471930 kernel: audit: type=1334 audit(1765887070.450:8): prog-id=6 op=LOAD
Dec 16 12:11:10.471939 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:11:10.471949 systemd-journald[416]: Journal started
Dec 16 12:11:10.471969 systemd-journald[416]: Runtime Journal (/run/log/journal/b8ab6db6e2474144b7213534fbb295d8) is 8M, max 319.5M, 311.5M free.
Dec 16 12:11:10.403000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.408000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.420000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.441000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.445000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.450000 audit: BPF prog-id=6 op=LOAD
Dec 16 12:11:10.399943 systemd-modules-load[418]: Inserted module 'br_netfilter'
Dec 16 12:11:10.474193 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:11:10.474650 dracut-cmdline[445]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=openstack verity.usrhash=756b815c2fd7ac2947efceb2a88878d1ea9723ec85037c2b4d1a09bd798bb749
Dec 16 12:11:10.482701 kernel: audit: type=1130 audit(1765887070.474:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.474000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.479036 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories...
Dec 16 12:11:10.497232 systemd-tmpfiles[473]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring.
Dec 16 12:11:10.500602 systemd-resolved[446]: Positive Trust Anchors:
Dec 16 12:11:10.500623 systemd-resolved[446]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d
Dec 16 12:11:10.502000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.500634 systemd-resolved[446]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16
Dec 16 12:11:10.508437 kernel: audit: type=1130 audit(1765887070.502:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.500665 systemd-resolved[446]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test
Dec 16 12:11:10.501515 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:11:10.528573 systemd-resolved[446]: Defaulting to hostname 'linux'.
Dec 16 12:11:10.530000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.529686 systemd[1]: Started systemd-resolved.service - Network Name Resolution.
Dec 16 12:11:10.530730 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:11:10.570682 kernel: Loading iSCSI transport class v2.0-870.
Dec 16 12:11:10.580662 kernel: iscsi: registered transport (tcp)
Dec 16 12:11:10.594676 kernel: iscsi: registered transport (qla4xxx)
Dec 16 12:11:10.594735 kernel: QLogic iSCSI HBA Driver
Dec 16 12:11:10.615507 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:11:10.637927 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:11:10.639000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.641411 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:11:10.686179 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:11:10.686000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.688591 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook...
Dec 16 12:11:10.690111 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline...
Dec 16 12:11:10.732383 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:11:10.732000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.733000 audit: BPF prog-id=7 op=LOAD
Dec 16 12:11:10.733000 audit: BPF prog-id=8 op=LOAD
Dec 16 12:11:10.734958 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:11:10.767076 systemd-udevd[697]: Using default interface naming scheme 'v257'.
Dec 16 12:11:10.774913 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:11:10.775000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.778406 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook...
Dec 16 12:11:10.799535 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:11:10.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.801000 audit: BPF prog-id=9 op=LOAD
Dec 16 12:11:10.802437 systemd[1]: Starting systemd-networkd.service - Network Configuration...
Dec 16 12:11:10.808303 dracut-pre-trigger[775]: rd.md=0: removing MD RAID activation
Dec 16 12:11:10.831535 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 12:11:10.832000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.833978 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:11:10.845820 systemd-networkd[804]: lo: Link UP
Dec 16 12:11:10.845824 systemd-networkd[804]: lo: Gained carrier
Dec 16 12:11:10.847000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.846263 systemd[1]: Started systemd-networkd.service - Network Configuration.
Dec 16 12:11:10.847743 systemd[1]: Reached target network.target - Network.
Dec 16 12:11:10.919713 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:11:10.920000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:10.922832 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook...
Dec 16 12:11:10.993452 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM.
Dec 16 12:11:11.008486 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT.
Dec 16 12:11:11.032655 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1
Dec 16 12:11:11.041687 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0
Dec 16 12:11:11.052653 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:01.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2
Dec 16 12:11:11.054908 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A.
Dec 16 12:11:11.058020 systemd-networkd[804]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:11:11.058036 systemd-networkd[804]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network.
Dec 16 12:11:11.059190 systemd-networkd[804]: eth0: Link UP
Dec 16 12:11:11.059648 systemd-networkd[804]: eth0: Gained carrier
Dec 16 12:11:11.059660 systemd-networkd[804]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network
Dec 16 12:11:11.063044 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM.
Dec 16 12:11:11.067721 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary...
Dec 16 12:11:11.068983 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:11:11.069000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:11.069128 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:11:11.070657 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:11:11.075508 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup...
Dec 16 12:11:11.089896 disk-uuid[880]: Primary Header is updated.
Dec 16 12:11:11.089896 disk-uuid[880]: Secondary Entries is updated.
Dec 16 12:11:11.089896 disk-uuid[880]: Secondary Header is updated.
Dec 16 12:11:11.095477 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:11:11.097000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:11.107654 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0
Dec 16 12:11:11.107878 kernel: usbcore: registered new interface driver usbhid
Dec 16 12:11:11.111667 kernel: usbhid: USB HID core driver
Dec 16 12:11:11.116757 systemd-networkd[804]: eth0: DHCPv4 address 10.0.21.106/25, gateway 10.0.21.1 acquired from 10.0.21.1
Dec 16 12:11:11.171731 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook.
Dec 16 12:11:11.172000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:11.173702 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 12:11:11.175295 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:11:11.177323 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:11:11.180141 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook...
Dec 16 12:11:11.201543 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 12:11:11.202000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.129072 disk-uuid[882]: Warning: The kernel is still using the old partition table.
Dec 16 12:11:12.129072 disk-uuid[882]: The new table will be used at the next reboot or after you
Dec 16 12:11:12.129072 disk-uuid[882]: run partprobe(8) or kpartx(8)
Dec 16 12:11:12.129072 disk-uuid[882]: The operation has completed successfully.
Dec 16 12:11:12.139992 systemd[1]: disk-uuid.service: Deactivated successfully.
Dec 16 12:11:12.140000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.140000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.140098 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary.
Dec 16 12:11:12.143246 systemd[1]: Starting ignition-setup.service - Ignition (setup)...
Dec 16 12:11:12.191704 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (909)
Dec 16 12:11:12.194027 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:11:12.194121 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 16 12:11:12.198738 kernel: BTRFS info (device vda6): turning on async discard
Dec 16 12:11:12.198816 kernel: BTRFS info (device vda6): enabling free space tree
Dec 16 12:11:12.204647 kernel: BTRFS info (device vda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:11:12.205118 systemd[1]: Finished ignition-setup.service - Ignition (setup).
Dec 16 12:11:12.205000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.207335 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)...
Dec 16 12:11:12.248832 systemd-networkd[804]: eth0: Gained IPv6LL
Dec 16 12:11:12.334679 ignition[928]: Ignition 2.24.0
Dec 16 12:11:12.334690 ignition[928]: Stage: fetch-offline
Dec 16 12:11:12.334731 ignition[928]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:12.334741 ignition[928]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:12.337215 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 12:11:12.338000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.334898 ignition[928]: parsed url from cmdline: ""
Dec 16 12:11:12.334901 ignition[928]: no config URL provided
Dec 16 12:11:12.335586 ignition[928]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 12:11:12.340789 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)...
Dec 16 12:11:12.335594 ignition[928]: no config at "/usr/lib/ignition/user.ign"
Dec 16 12:11:12.335599 ignition[928]: failed to fetch config: resource requires networking
Dec 16 12:11:12.335772 ignition[928]: Ignition finished successfully
Dec 16 12:11:12.365664 ignition[941]: Ignition 2.24.0
Dec 16 12:11:12.365680 ignition[941]: Stage: fetch
Dec 16 12:11:12.365820 ignition[941]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:12.365828 ignition[941]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:12.365904 ignition[941]: parsed url from cmdline: ""
Dec 16 12:11:12.365908 ignition[941]: no config URL provided
Dec 16 12:11:12.365912 ignition[941]: reading system config file "/usr/lib/ignition/user.ign"
Dec 16 12:11:12.365917 ignition[941]: no config at "/usr/lib/ignition/user.ign"
Dec 16 12:11:12.366173 ignition[941]: config drive ("/dev/disk/by-label/config-2") not found. Waiting...
Dec 16 12:11:12.366192 ignition[941]: config drive ("/dev/disk/by-label/CONFIG-2") not found. Waiting...
Dec 16 12:11:12.366223 ignition[941]: GET http://169.254.169.254/openstack/latest/user_data: attempt #1
Dec 16 12:11:12.742740 ignition[941]: GET result: OK
Dec 16 12:11:12.743038 ignition[941]: parsing config with SHA512: 6fb8e646b52b57e1e234aecd1111d048269323e76f7d69681ff15a46355b22c04fcd181e98c0eceef89cef28b902f00ad2b9123d6089b37fe4884cdc87bc67f2
Dec 16 12:11:12.747918 unknown[941]: fetched base config from "system"
Dec 16 12:11:12.747931 unknown[941]: fetched base config from "system"
Dec 16 12:11:12.748260 ignition[941]: fetch: fetch complete
Dec 16 12:11:12.754868 kernel: kauditd_printk_skb: 20 callbacks suppressed
Dec 16 12:11:12.754893 kernel: audit: type=1130 audit(1765887072.750:31): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.750000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.747936 unknown[941]: fetched user config from "openstack"
Dec 16 12:11:12.748264 ignition[941]: fetch: fetch passed
Dec 16 12:11:12.750686 systemd[1]: Finished ignition-fetch.service - Ignition (fetch).
Dec 16 12:11:12.748303 ignition[941]: Ignition finished successfully
Dec 16 12:11:12.753047 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)...
Dec 16 12:11:12.775976 ignition[950]: Ignition 2.24.0
Dec 16 12:11:12.775995 ignition[950]: Stage: kargs
Dec 16 12:11:12.776139 ignition[950]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:12.776147 ignition[950]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:12.776976 ignition[950]: kargs: kargs passed
Dec 16 12:11:12.779262 systemd[1]: Finished ignition-kargs.service - Ignition (kargs).
Dec 16 12:11:12.783666 kernel: audit: type=1130 audit(1765887072.779:32): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.779000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.777022 ignition[950]: Ignition finished successfully
Dec 16 12:11:12.783760 systemd[1]: Starting ignition-disks.service - Ignition (disks)...
Dec 16 12:11:12.807552 ignition[957]: Ignition 2.24.0
Dec 16 12:11:12.807573 ignition[957]: Stage: disks
Dec 16 12:11:12.807742 ignition[957]: no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:12.807750 ignition[957]: no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:12.808468 ignition[957]: disks: disks passed
Dec 16 12:11:12.811712 systemd[1]: Finished ignition-disks.service - Ignition (disks).
Dec 16 12:11:12.812000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.808507 ignition[957]: Ignition finished successfully
Dec 16 12:11:12.817514 kernel: audit: type=1130 audit(1765887072.812:33): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.813316 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device.
Dec 16 12:11:12.816959 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems.
Dec 16 12:11:12.818578 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 12:11:12.820364 systemd[1]: Reached target sysinit.target - System Initialization.
Dec 16 12:11:12.822189 systemd[1]: Reached target basic.target - Basic System.
Dec 16 12:11:12.824566 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT...
Dec 16 12:11:12.874968 systemd-fsck[966]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks
Dec 16 12:11:12.878346 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT.
Dec 16 12:11:12.879000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.881652 systemd[1]: Mounting sysroot.mount - /sysroot...
Dec 16 12:11:12.885405 kernel: audit: type=1130 audit(1765887072.879:34): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:12.977656 kernel: EXT4-fs (vda9): mounted filesystem 0e69f709-36a9-4e15-b0c9-c7e150185653 r/w with ordered data mode. Quota mode: none.
Dec 16 12:11:12.977895 systemd[1]: Mounted sysroot.mount - /sysroot.
Dec 16 12:11:12.979106 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System.
Dec 16 12:11:12.983797 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 12:11:12.986466 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr...
Dec 16 12:11:12.987470 systemd[1]: flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent was skipped because no trigger condition checks were met.
Dec 16 12:11:13.001614 systemd[1]: Starting flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent...
Dec 16 12:11:13.002715 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot).
Dec 16 12:11:13.002748 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 12:11:13.004667 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr.
Dec 16 12:11:13.006872 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup...
Dec 16 12:11:13.019652 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (974)
Dec 16 12:11:13.023748 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:11:13.023795 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 16 12:11:13.031472 kernel: BTRFS info (device vda6): turning on async discard
Dec 16 12:11:13.031515 kernel: BTRFS info (device vda6): enabling free space tree
Dec 16 12:11:13.032894 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 12:11:13.065674 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:11:13.172531 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup.
Dec 16 12:11:13.173000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:13.174576 systemd[1]: Starting ignition-mount.service - Ignition (mount)...
Dec 16 12:11:13.178771 kernel: audit: type=1130 audit(1765887073.173:35): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:13.178679 systemd[1]: Starting sysroot-boot.service - /sysroot/boot...
Dec 16 12:11:13.191513 systemd[1]: sysroot-oem.mount: Deactivated successfully.
Dec 16 12:11:13.194651 kernel: BTRFS info (device vda6): last unmount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:11:13.212412 ignition[1075]: INFO : Ignition 2.24.0
Dec 16 12:11:13.212412 ignition[1075]: INFO : Stage: mount
Dec 16 12:11:13.214150 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:13.214150 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:13.214150 ignition[1075]: INFO : mount: mount passed
Dec 16 12:11:13.214150 ignition[1075]: INFO : Ignition finished successfully
Dec 16 12:11:13.223862 kernel: audit: type=1130 audit(1765887073.215:36): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:13.223890 kernel: audit: type=1130 audit(1765887073.220:37): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:13.215000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:13.220000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:13.214446 systemd[1]: Finished sysroot-boot.service - /sysroot/boot.
Dec 16 12:11:13.216065 systemd[1]: Finished ignition-mount.service - Ignition (mount).
Dec 16 12:11:14.097699 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:11:16.102736 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:11:20.109671 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev
Dec 16 12:11:20.114738 coreos-metadata[976]: Dec 16 12:11:20.114 WARN failed to locate config-drive, using the metadata service API instead
Dec 16 12:11:20.132754 coreos-metadata[976]: Dec 16 12:11:20.132 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1
Dec 16 12:11:21.017549 coreos-metadata[976]: Dec 16 12:11:21.017 INFO Fetch successful
Dec 16 12:11:21.018587 coreos-metadata[976]: Dec 16 12:11:21.017 INFO wrote hostname ci-4547-0-0-4-c6e23b3406 to /sysroot/etc/hostname
Dec 16 12:11:21.020155 systemd[1]: flatcar-openstack-hostname.service: Deactivated successfully.
Dec 16 12:11:21.020250 systemd[1]: Finished flatcar-openstack-hostname.service - Flatcar OpenStack Metadata Hostname Agent.
Dec 16 12:11:21.029229 kernel: audit: type=1130 audit(1765887081.022:38): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:21.029255 kernel: audit: type=1131 audit(1765887081.022:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:21.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:21.022000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-openstack-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:21.024369 systemd[1]: Starting ignition-files.service - Ignition (files)...
Dec 16 12:11:21.045313 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem...
Dec 16 12:11:21.064644 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/vda6 (254:6) scanned by mount (1092)
Dec 16 12:11:21.066834 kernel: BTRFS info (device vda6): first mount of filesystem eb4bb268-dde2-45a9-b660-8899d8790a47
Dec 16 12:11:21.066884 kernel: BTRFS info (device vda6): using crc32c (crc32c-generic) checksum algorithm
Dec 16 12:11:21.075940 kernel: BTRFS info (device vda6): turning on async discard
Dec 16 12:11:21.076012 kernel: BTRFS info (device vda6): enabling free space tree
Dec 16 12:11:21.077383 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem.
Dec 16 12:11:21.109621 ignition[1110]: INFO : Ignition 2.24.0
Dec 16 12:11:21.109621 ignition[1110]: INFO : Stage: files
Dec 16 12:11:21.111291 ignition[1110]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:21.111291 ignition[1110]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:21.111291 ignition[1110]: DEBUG : files: compiled without relabeling support, skipping
Dec 16 12:11:21.114600 ignition[1110]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core"
Dec 16 12:11:21.114600 ignition[1110]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core"
Dec 16 12:11:21.117171 ignition[1110]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core"
Dec 16 12:11:21.117171 ignition[1110]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core"
Dec 16 12:11:21.117171 ignition[1110]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core"
Dec 16 12:11:21.116373 unknown[1110]: wrote ssh authorized keys file for user: core
Dec 16 12:11:21.122212 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 16 12:11:21.122212 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1
Dec 16 12:11:21.198971 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK
Dec 16 12:11:21.638690 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz"
Dec 16 12:11:21.638690 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf"
Dec 16 12:11:21.643070 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 16 12:11:21.658756 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 16 12:11:21.658756 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 16 12:11:21.658756 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.33.0-arm64.raw: attempt #1
Dec 16 12:11:22.064140 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK
Dec 16 12:11:23.682177 ignition[1110]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.33.0-arm64.raw"
Dec 16 12:11:23.682177 ignition[1110]: INFO : files: op(b): [started] processing unit "prepare-helm.service"
Dec 16 12:11:23.686626 ignition[1110]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: op(b): [finished] processing unit "prepare-helm.service"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: op(d): [started] setting preset to enabled for "prepare-helm.service"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: op(d): [finished] setting preset to enabled for "prepare-helm.service"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: createResultFile: createFiles: op(e): [started] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: createResultFile: createFiles: op(e): [finished] writing file "/sysroot/etc/.ignition-result.json"
Dec 16 12:11:23.688717 ignition[1110]: INFO : files: files passed
Dec 16 12:11:23.688717 ignition[1110]: INFO : Ignition finished successfully
Dec 16 12:11:23.703699 kernel: audit: type=1130 audit(1765887083.691:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.691000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.690586 systemd[1]: Finished ignition-files.service - Ignition (files).
Dec 16 12:11:23.692871 systemd[1]: Starting ignition-quench.service - Ignition (record completion)...
Dec 16 12:11:23.716964 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion...
Dec 16 12:11:23.719984 systemd[1]: ignition-quench.service: Deactivated successfully.
Dec 16 12:11:23.720084 systemd[1]: Finished ignition-quench.service - Ignition (record completion).
Dec 16 12:11:23.727085 kernel: audit: type=1130 audit(1765887083.721:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.727113 kernel: audit: type=1131 audit(1765887083.721:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.721000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.721000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.748757 initrd-setup-root-after-ignition[1145]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 12:11:23.748757 initrd-setup-root-after-ignition[1145]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 12:11:23.751816 initrd-setup-root-after-ignition[1149]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory
Dec 16 12:11:23.752891 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 12:11:23.753000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.754702 systemd[1]: Reached target ignition-complete.target - Ignition Complete.
Dec 16 12:11:23.760036 kernel: audit: type=1130 audit(1765887083.753:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.759961 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root...
Dec 16 12:11:23.838017 systemd[1]: initrd-parse-etc.service: Deactivated successfully.
Dec 16 12:11:23.838142 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root.
Dec 16 12:11:23.839000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.839000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.840297 systemd[1]: Reached target initrd-fs.target - Initrd File Systems.
Dec 16 12:11:23.847328 kernel: audit: type=1130 audit(1765887083.839:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.847355 kernel: audit: type=1131 audit(1765887083.839:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.846511 systemd[1]: Reached target initrd.target - Initrd Default Target.
Dec 16 12:11:23.848413 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met.
Dec 16 12:11:23.849312 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook...
Dec 16 12:11:23.864431 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 12:11:23.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.868754 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons...
Dec 16 12:11:23.871354 kernel: audit: type=1130 audit(1765887083.864:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.884808 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr.
Dec 16 12:11:23.885018 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups.
Dec 16 12:11:23.887315 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:11:23.889270 systemd[1]: Stopped target timers.target - Timer Units.
Dec 16 12:11:23.890909 systemd[1]: dracut-pre-pivot.service: Deactivated successfully.
Dec 16 12:11:23.891000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.891033 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook.
Dec 16 12:11:23.896570 kernel: audit: type=1131 audit(1765887083.891:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.895684 systemd[1]: Stopped target initrd.target - Initrd Default Target.
Dec 16 12:11:23.897816 systemd[1]: Stopped target basic.target - Basic System.
Dec 16 12:11:23.899540 systemd[1]: Stopped target ignition-complete.target - Ignition Complete.
Dec 16 12:11:23.901197 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup.
Dec 16 12:11:23.902975 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device.
Dec 16 12:11:23.904739 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System.
Dec 16 12:11:23.906642 systemd[1]: Stopped target remote-fs.target - Remote File Systems.
Dec 16 12:11:23.908457 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems.
Dec 16 12:11:23.910346 systemd[1]: Stopped target sysinit.target - System Initialization.
Dec 16 12:11:23.912214 systemd[1]: Stopped target local-fs.target - Local File Systems.
Dec 16 12:11:23.913793 systemd[1]: Stopped target swap.target - Swaps.
Dec 16 12:11:23.915202 systemd[1]: dracut-pre-mount.service: Deactivated successfully.
Dec 16 12:11:23.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.915327 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook.
Dec 16 12:11:23.917426 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:11:23.918526 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:11:23.920388 systemd[1]: clevis-luks-askpass.path: Deactivated successfully.
Dec 16 12:11:23.923722 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:11:23.925094 systemd[1]: dracut-initqueue.service: Deactivated successfully.
Dec 16 12:11:23.926000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.925212 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook.
Dec 16 12:11:23.927904 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully.
Dec 16 12:11:23.930000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.928026 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion.
Dec 16 12:11:23.931000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.930212 systemd[1]: ignition-files.service: Deactivated successfully.
Dec 16 12:11:23.930308 systemd[1]: Stopped ignition-files.service - Ignition (files).
Dec 16 12:11:23.932783 systemd[1]: Stopping ignition-mount.service - Ignition (mount)...
Dec 16 12:11:23.937000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.934313 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot...
Dec 16 12:11:23.938000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.935227 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully.
Dec 16 12:11:23.940000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.935354 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories.
Dec 16 12:11:23.937291 systemd[1]: systemd-udev-trigger.service: Deactivated successfully.
Dec 16 12:11:23.937432 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:11:23.939087 systemd[1]: dracut-pre-trigger.service: Deactivated successfully.
Dec 16 12:11:23.939228 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook.
Dec 16 12:11:23.950423 systemd[1]: initrd-cleanup.service: Deactivated successfully.
Dec 16 12:11:23.950535 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons.
Dec 16 12:11:23.952000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.952000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.956136 ignition[1170]: INFO : Ignition 2.24.0
Dec 16 12:11:23.956136 ignition[1170]: INFO : Stage: umount
Dec 16 12:11:23.958716 ignition[1170]: INFO : no configs at "/usr/lib/ignition/base.d"
Dec 16 12:11:23.958716 ignition[1170]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/openstack"
Dec 16 12:11:23.958716 ignition[1170]: INFO : umount: umount passed
Dec 16 12:11:23.958716 ignition[1170]: INFO : Ignition finished successfully
Dec 16 12:11:23.961000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.959673 systemd[1]: ignition-mount.service: Deactivated successfully.
Dec 16 12:11:23.964000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.960676 systemd[1]: Stopped ignition-mount.service - Ignition (mount).
Dec 16 12:11:23.966000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.962197 systemd[1]: ignition-disks.service: Deactivated successfully.
Dec 16 12:11:23.968000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.962245 systemd[1]: Stopped ignition-disks.service - Ignition (disks).
Dec 16 12:11:23.964497 systemd[1]: ignition-kargs.service: Deactivated successfully.
Dec 16 12:11:23.971000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.964541 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs).
Dec 16 12:11:23.966420 systemd[1]: ignition-fetch.service: Deactivated successfully.
Dec 16 12:11:23.966468 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch).
Dec 16 12:11:23.968814 systemd[1]: Stopped target network.target - Network.
Dec 16 12:11:23.970276 systemd[1]: ignition-fetch-offline.service: Deactivated successfully.
Dec 16 12:11:23.970340 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline).
Dec 16 12:11:23.972109 systemd[1]: Stopped target paths.target - Path Units.
Dec 16 12:11:23.973545 systemd[1]: systemd-ask-password-console.path: Deactivated successfully.
Dec 16 12:11:23.976975 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:11:23.978079 systemd[1]: Stopped target slices.target - Slice Units.
Dec 16 12:11:23.979612 systemd[1]: Stopped target sockets.target - Socket Units.
Dec 16 12:11:23.988000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.981450 systemd[1]: iscsid.socket: Deactivated successfully.
Dec 16 12:11:23.989000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.981488 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket.
Dec 16 12:11:23.982916 systemd[1]: iscsiuio.socket: Deactivated successfully.
Dec 16 12:11:23.982945 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket.
Dec 16 12:11:23.984508 systemd[1]: systemd-journald-audit.socket: Deactivated successfully.
Dec 16 12:11:23.984529 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:11:23.995000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.997000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.986647 systemd[1]: ignition-setup.service: Deactivated successfully.
Dec 16 12:11:23.986702 systemd[1]: Stopped ignition-setup.service - Ignition (setup).
Dec 16 12:11:23.999000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.988257 systemd[1]: ignition-setup-pre.service: Deactivated successfully.
Dec 16 12:11:23.988300 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup.
Dec 16 12:11:23.989899 systemd[1]: Stopping systemd-networkd.service - Network Configuration...
Dec 16 12:11:23.991455 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution...
Dec 16 12:11:23.993808 systemd[1]: sysroot-boot.mount: Deactivated successfully.
Dec 16 12:11:23.994331 systemd[1]: sysroot-boot.service: Deactivated successfully.
Dec 16 12:11:23.994407 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot.
Dec 16 12:11:24.006000 audit: BPF prog-id=6 op=UNLOAD
Dec 16 12:11:24.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:23.995859 systemd[1]: initrd-setup-root.service: Deactivated successfully.
Dec 16 12:11:23.995945 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup.
Dec 16 12:11:23.998775 systemd[1]: systemd-resolved.service: Deactivated successfully.
Dec 16 12:11:23.998862 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution.
Dec 16 12:11:24.005972 systemd[1]: systemd-networkd.service: Deactivated successfully.
Dec 16 12:11:24.006223 systemd[1]: Stopped systemd-networkd.service - Network Configuration.
Dec 16 12:11:24.010809 systemd[1]: Stopped target network-pre.target - Preparation for Network.
Dec 16 12:11:24.017000 audit: BPF prog-id=9 op=UNLOAD
Dec 16 12:11:24.017000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.011839 systemd[1]: systemd-networkd.socket: Deactivated successfully.
Dec 16 12:11:24.018000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.011873 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:11:24.020000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.014411 systemd[1]: Stopping network-cleanup.service - Network Cleanup...
Dec 16 12:11:24.015258 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully.
Dec 16 12:11:24.015315 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline.
Dec 16 12:11:24.017342 systemd[1]: systemd-sysctl.service: Deactivated successfully.
Dec 16 12:11:24.017403 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:11:24.019092 systemd[1]: systemd-modules-load.service: Deactivated successfully.
Dec 16 12:11:24.019135 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:11:24.020730 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files...
Dec 16 12:11:24.042958 systemd[1]: systemd-udevd.service: Deactivated successfully.
Dec 16 12:11:24.043110 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files.
Dec 16 12:11:24.045000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.046075 systemd[1]: systemd-udevd-control.socket: Deactivated successfully.
Dec 16 12:11:24.046111 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:11:24.047703 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully.
Dec 16 12:11:24.050000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.047732 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:11:24.049456 systemd[1]: dracut-pre-udev.service: Deactivated successfully.
Dec 16 12:11:24.053000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.049501 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook.
Dec 16 12:11:24.052029 systemd[1]: dracut-cmdline.service: Deactivated successfully.
Dec 16 12:11:24.055000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.052080 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook.
Dec 16 12:11:24.054465 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully.
Dec 16 12:11:24.054514 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters.
Dec 16 12:11:24.061297 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database...
Dec 16 12:11:24.062336 systemd[1]: systemd-network-generator.service: Deactivated successfully.
Dec 16 12:11:24.064000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.062401 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:11:24.066000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.065407 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully.
Dec 16 12:11:24.068000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.065456 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev.
Dec 16 12:11:24.070000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.066773 systemd[1]: systemd-tmpfiles-setup-dev-early.service: Deactivated successfully.
Dec 16 12:11:24.072000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.066818 systemd[1]: Stopped systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:11:24.068984 systemd[1]: kmod-static-nodes.service: Deactivated successfully.
Dec 16 12:11:24.075000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.069027 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:11:24.078000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.078000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.070796 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully.
Dec 16 12:11:24.070841 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup.
Dec 16 12:11:24.073315 systemd[1]: network-cleanup.service: Deactivated successfully.
Dec 16 12:11:24.074666 systemd[1]: Stopped network-cleanup.service - Network Cleanup.
Dec 16 12:11:24.075968 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully.
Dec 16 12:11:24.076040 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database.
Dec 16 12:11:24.079378 systemd[1]: Reached target initrd-switch-root.target - Switch Root.
Dec 16 12:11:24.084088 systemd[1]: Starting initrd-switch-root.service - Switch Root...
Dec 16 12:11:24.104605 systemd[1]: Switching root.
Dec 16 12:11:24.135080 systemd-journald[416]: Journal stopped
Dec 16 12:11:24.990885 systemd-journald[416]: Received SIGTERM from PID 1 (systemd).
Dec 16 12:11:24.990963 kernel: SELinux: policy capability network_peer_controls=1
Dec 16 12:11:24.990987 kernel: SELinux: policy capability open_perms=1
Dec 16 12:11:24.990997 kernel: SELinux: policy capability extended_socket_class=1
Dec 16 12:11:24.991006 kernel: SELinux: policy capability always_check_network=0
Dec 16 12:11:24.991019 kernel: SELinux: policy capability cgroup_seclabel=1
Dec 16 12:11:24.991032 kernel: SELinux: policy capability nnp_nosuid_transition=1
Dec 16 12:11:24.991042 kernel: SELinux: policy capability genfs_seclabel_symlinks=0
Dec 16 12:11:24.991053 kernel: SELinux: policy capability ioctl_skip_cloexec=0
Dec 16 12:11:24.991063 kernel: SELinux: policy capability userspace_initial_context=0
Dec 16 12:11:24.991076 systemd[1]: Successfully loaded SELinux policy in 62.737ms.
Dec 16 12:11:24.991096 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 5.742ms.
Dec 16 12:11:24.991110 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE)
Dec 16 12:11:24.991126 systemd[1]: Detected virtualization kvm.
Dec 16 12:11:24.991139 systemd[1]: Detected architecture arm64.
Dec 16 12:11:24.991150 systemd[1]: Detected first boot.
Dec 16 12:11:24.991160 systemd[1]: Hostname set to .
Dec 16 12:11:24.991173 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID.
Dec 16 12:11:24.991183 zram_generator::config[1213]: No configuration found.
Dec 16 12:11:24.991201 kernel: NET: Registered PF_VSOCK protocol family
Dec 16 12:11:24.991213 systemd[1]: Populated /etc with preset unit settings.
Dec 16 12:11:24.991224 systemd[1]: initrd-switch-root.service: Deactivated successfully.
Dec 16 12:11:24.991234 systemd[1]: Stopped initrd-switch-root.service - Switch Root.
Dec 16 12:11:24.991245 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1.
Dec 16 12:11:24.991257 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config.
Dec 16 12:11:24.991267 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run.
Dec 16 12:11:24.991280 systemd[1]: Created slice system-getty.slice - Slice /system/getty.
Dec 16 12:11:24.991291 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe.
Dec 16 12:11:24.991302 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty.
Dec 16 12:11:24.991313 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit.
Dec 16 12:11:24.991324 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck.
Dec 16 12:11:24.991334 systemd[1]: Created slice user.slice - User and Session Slice.
Dec 16 12:11:24.991345 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch.
Dec 16 12:11:24.991357 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch.
Dec 16 12:11:24.991367 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch.
Dec 16 12:11:24.991378 systemd[1]: Set up automount boot.automount - Boot partition Automount Point.
Dec 16 12:11:24.991396 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point.
Dec 16 12:11:24.991408 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM...
Dec 16 12:11:24.991421 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0...
Dec 16 12:11:24.991432 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre).
Dec 16 12:11:24.991444 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes.
Dec 16 12:11:24.991457 systemd[1]: Stopped target initrd-switch-root.target - Switch Root.
Dec 16 12:11:24.991468 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems.
Dec 16 12:11:24.991479 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System.
Dec 16 12:11:24.991490 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes.
Dec 16 12:11:24.991503 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes.
Dec 16 12:11:24.991514 systemd[1]: Reached target remote-fs.target - Remote File Systems.
Dec 16 12:11:24.991524 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes.
Dec 16 12:11:24.991535 systemd[1]: Reached target slices.target - Slice Units.
Dec 16 12:11:24.991545 systemd[1]: Reached target swap.target - Swaps.
Dec 16 12:11:24.991556 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes.
Dec 16 12:11:24.991584 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket.
Dec 16 12:11:24.991599 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption.
Dec 16 12:11:24.991610 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket.
Dec 16 12:11:24.991621 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket.
Dec 16 12:11:24.991645 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket.
Dec 16 12:11:24.991657 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket.
Dec 16 12:11:24.991668 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket.
Dec 16 12:11:24.991679 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket.
Dec 16 12:11:24.991692 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket.
Dec 16 12:11:24.991703 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket.
Dec 16 12:11:24.991714 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System...
Dec 16 12:11:24.991724 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System...
Dec 16 12:11:24.991735 systemd[1]: Mounting media.mount - External Media Directory...
Dec 16 12:11:24.991746 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System...
Dec 16 12:11:24.991757 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System...
Dec 16 12:11:24.991770 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp...
Dec 16 12:11:24.991781 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw).
Dec 16 12:11:24.991791 systemd[1]: Reached target machines.target - Containers.
Dec 16 12:11:24.991802 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files...
Dec 16 12:11:24.991812 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met.
Dec 16 12:11:24.991823 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes...
Dec 16 12:11:24.991834 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs...
Dec 16 12:11:24.991846 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod...
Dec 16 12:11:24.991857 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm...
Dec 16 12:11:24.991867 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore...
Dec 16 12:11:24.991882 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse...
Dec 16 12:11:24.991892 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop...
Dec 16 12:11:24.991903 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf).
Dec 16 12:11:24.991914 systemd[1]: systemd-fsck-root.service: Deactivated successfully.
Dec 16 12:11:24.991925 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device.
Dec 16 12:11:24.991935 systemd[1]: systemd-fsck-usr.service: Deactivated successfully.
Dec 16 12:11:24.991949 systemd[1]: Stopped systemd-fsck-usr.service.
Dec 16 12:11:24.991961 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67).
Dec 16 12:11:24.991972 systemd[1]: Starting systemd-journald.service - Journal Service...
Dec 16 12:11:24.991983 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules...
Dec 16 12:11:24.991994 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line...
Dec 16 12:11:24.992006 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems...
Dec 16 12:11:24.992017 kernel: ACPI: bus type drm_connector registered
Dec 16 12:11:24.992027 kernel: fuse: init (API version 7.41)
Dec 16 12:11:24.992039 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials...
Dec 16 12:11:24.992049 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices...
Dec 16 12:11:24.992060 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System.
Dec 16 12:11:24.992073 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System.
Dec 16 12:11:24.992107 systemd-journald[1282]: Collecting audit messages is enabled.
Dec 16 12:11:24.992133 systemd[1]: Mounted media.mount - External Media Directory.
Dec 16 12:11:24.992145 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System.
Dec 16 12:11:24.992157 systemd-journald[1282]: Journal started
Dec 16 12:11:24.992180 systemd-journald[1282]: Runtime Journal (/run/log/journal/b8ab6db6e2474144b7213534fbb295d8) is 8M, max 319.5M, 311.5M free.
Dec 16 12:11:24.942000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.944000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.947000 audit: BPF prog-id=14 op=UNLOAD
Dec 16 12:11:24.947000 audit: BPF prog-id=13 op=UNLOAD
Dec 16 12:11:24.948000 audit: BPF prog-id=15 op=LOAD
Dec 16 12:11:24.948000 audit: BPF prog-id=16 op=LOAD
Dec 16 12:11:24.948000 audit: BPF prog-id=17 op=LOAD
Dec 16 12:11:24.988000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1
Dec 16 12:11:24.988000 audit[1282]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=6 a1=ffffdfe386c0 a2=4000 a3=0 items=0 ppid=1 pid=1282 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:11:24.988000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald"
Dec 16 12:11:24.767562 systemd[1]: Queued start job for default target multi-user.target.
Dec 16 12:11:24.779586 systemd[1]: Unnecessary job was removed for dev-vda6.device - /dev/vda6.
Dec 16 12:11:24.780051 systemd[1]: systemd-journald.service: Deactivated successfully.
Dec 16 12:11:24.995382 systemd[1]: Started systemd-journald.service - Journal Service.
Dec 16 12:11:24.994000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:24.996359 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System.
Dec 16 12:11:24.997684 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp.
Dec 16 12:11:25.003017 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes.
Dec 16 12:11:25.003000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.004899 systemd[1]: modprobe@configfs.service: Deactivated successfully.
Dec 16 12:11:25.005672 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs.
Dec 16 12:11:25.006000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.007107 systemd[1]: modprobe@dm_mod.service: Deactivated successfully.
Dec 16 12:11:25.007263 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod.
Dec 16 12:11:25.007000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.007000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.008622 systemd[1]: modprobe@drm.service: Deactivated successfully.
Dec 16 12:11:25.009836 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm.
Dec 16 12:11:25.010000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.010000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.011035 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully.
Dec 16 12:11:25.011184 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore.
Dec 16 12:11:25.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.011000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.012894 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files.
Dec 16 12:11:25.013000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.014165 systemd[1]: modprobe@fuse.service: Deactivated successfully.
Dec 16 12:11:25.014335 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse.
Dec 16 12:11:25.014000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.014000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.015653 systemd[1]: modprobe@loop.service: Deactivated successfully.
Dec 16 12:11:25.015805 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop.
Dec 16 12:11:25.016000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.016000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.018673 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules.
Dec 16 12:11:25.019000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.020066 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line.
Dec 16 12:11:25.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.022287 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems.
Dec 16 12:11:25.022000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.023940 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials.
Dec 16 12:11:25.024000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.036863 systemd[1]: Reached target network-pre.target - Preparation for Network.
Dec 16 12:11:25.038220 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket.
Dec 16 12:11:25.040615 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System...
Dec 16 12:11:25.042552 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System...
Dec 16 12:11:25.043810 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/).
Dec 16 12:11:25.043840 systemd[1]: Reached target local-fs.target - Local File Systems.
Dec 16 12:11:25.045580 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management.
Dec 16 12:11:25.052420 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met.
Dec 16 12:11:25.052542 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met.
Dec 16 12:11:25.054791 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database...
Dec 16 12:11:25.057832 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage...
Dec 16 12:11:25.059007 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore).
Dec 16 12:11:25.059982 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed...
Dec 16 12:11:25.061028 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met.
Dec 16 12:11:25.063845 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables...
Dec 16 12:11:25.068804 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/...
Dec 16 12:11:25.071172 systemd-journald[1282]: Time spent on flushing to /var/log/journal/b8ab6db6e2474144b7213534fbb295d8 is 23.831ms for 1812 entries.
Dec 16 12:11:25.071172 systemd-journald[1282]: System Journal (/var/log/journal/b8ab6db6e2474144b7213534fbb295d8) is 8M, max 588.1M, 580.1M free.
Dec 16 12:11:25.116424 systemd-journald[1282]: Received client request to flush runtime journal.
Dec 16 12:11:25.116481 kernel: loop1: detected capacity change from 0 to 45344
Dec 16 12:11:25.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.088000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.097000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.072332 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully...
Dec 16 12:11:25.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.074976 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System.
Dec 16 12:11:25.076905 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System.
Dec 16 12:11:25.086684 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices.
Dec 16 12:11:25.088185 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed.
Dec 16 12:11:25.090024 systemd[1]: Reached target first-boot-complete.target - First Boot Complete.
Dec 16 12:11:25.092439 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk...
Dec 16 12:11:25.096772 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables.
Dec 16 12:11:25.112785 systemd-tmpfiles[1334]: ACLs are not supported, ignoring.
Dec 16 12:11:25.112796 systemd-tmpfiles[1334]: ACLs are not supported, ignoring.
Dec 16 12:11:25.116034 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully.
Dec 16 12:11:25.117755 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage.
Dec 16 12:11:25.119000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.122725 systemd[1]: Starting systemd-sysusers.service - Create System Users...
Dec 16 12:11:25.133869 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk.
Dec 16 12:11:25.135000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.162477 systemd[1]: Finished systemd-sysusers.service - Create System Users.
Dec 16 12:11:25.163000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:11:25.164000 audit: BPF prog-id=18 op=LOAD
Dec 16 12:11:25.164000 audit: BPF prog-id=19 op=LOAD
Dec 16 12:11:25.164000 audit: BPF prog-id=20 op=LOAD
Dec 16 12:11:25.166064 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer...
Dec 16 12:11:25.167674 kernel: loop2: detected capacity change from 0 to 100192
Dec 16 12:11:25.168000 audit: BPF prog-id=21 op=LOAD
Dec 16 12:11:25.170822 systemd[1]: Starting systemd-resolved.service - Network Name Resolution...
Dec 16 12:11:25.172951 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev...
Dec 16 12:11:25.174000 audit: BPF prog-id=22 op=LOAD Dec 16 12:11:25.174000 audit: BPF prog-id=23 op=LOAD Dec 16 12:11:25.174000 audit: BPF prog-id=24 op=LOAD Dec 16 12:11:25.177207 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Dec 16 12:11:25.178000 audit: BPF prog-id=25 op=LOAD Dec 16 12:11:25.188000 audit: BPF prog-id=26 op=LOAD Dec 16 12:11:25.188000 audit: BPF prog-id=27 op=LOAD Dec 16 12:11:25.189623 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 12:11:25.200702 systemd-tmpfiles[1355]: ACLs are not supported, ignoring. Dec 16 12:11:25.201017 systemd-tmpfiles[1355]: ACLs are not supported, ignoring. Dec 16 12:11:25.204399 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:11:25.206000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:25.209656 kernel: loop3: detected capacity change from 0 to 1648 Dec 16 12:11:25.219005 systemd-nsresourced[1356]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Dec 16 12:11:25.220513 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Dec 16 12:11:25.221000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:25.225527 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 12:11:25.227000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:25.244654 kernel: loop4: detected capacity change from 0 to 211168 Dec 16 12:11:25.284948 systemd-resolved[1354]: Positive Trust Anchors: Dec 16 12:11:25.284969 systemd-resolved[1354]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:11:25.284973 systemd-resolved[1354]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:11:25.285008 systemd-resolved[1354]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:11:25.286459 systemd-oomd[1353]: No swap; memory pressure usage will be degraded Dec 16 12:11:25.288480 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Dec 16 12:11:25.289000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:25.294660 kernel: loop5: detected capacity change from 0 to 45344 Dec 16 12:11:25.296837 systemd-resolved[1354]: Using system hostname 'ci-4547-0-0-4-c6e23b3406'. Dec 16 12:11:25.298211 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:11:25.299000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:25.299707 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:11:25.308659 kernel: loop6: detected capacity change from 0 to 100192 Dec 16 12:11:25.323657 kernel: loop7: detected capacity change from 0 to 1648 Dec 16 12:11:25.329657 kernel: loop1: detected capacity change from 0 to 211168 Dec 16 12:11:25.343315 (sd-merge)[1377]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-stackit.raw'. Dec 16 12:11:25.346376 (sd-merge)[1377]: Merged extensions into '/usr'. Dec 16 12:11:25.350308 systemd[1]: Reload requested from client PID 1333 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 12:11:25.350329 systemd[1]: Reloading... Dec 16 12:11:25.401643 zram_generator::config[1407]: No configuration found. Dec 16 12:11:25.551373 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 12:11:25.551478 systemd[1]: Reloading finished in 200 ms. Dec 16 12:11:25.583427 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 12:11:25.584000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:25.598311 systemd[1]: Starting ensure-sysext.service... Dec 16 12:11:25.600246 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Dec 16 12:11:25.601000 audit: BPF prog-id=28 op=LOAD Dec 16 12:11:25.601000 audit: BPF prog-id=21 op=UNLOAD Dec 16 12:11:25.601000 audit: BPF prog-id=29 op=LOAD Dec 16 12:11:25.601000 audit: BPF prog-id=25 op=UNLOAD Dec 16 12:11:25.601000 audit: BPF prog-id=30 op=LOAD Dec 16 12:11:25.601000 audit: BPF prog-id=31 op=LOAD Dec 16 12:11:25.601000 audit: BPF prog-id=26 op=UNLOAD Dec 16 12:11:25.602000 audit: BPF prog-id=27 op=UNLOAD Dec 16 12:11:25.602000 audit: BPF prog-id=32 op=LOAD Dec 16 12:11:25.602000 audit: BPF prog-id=22 op=UNLOAD Dec 16 12:11:25.602000 audit: BPF prog-id=33 op=LOAD Dec 16 12:11:25.602000 audit: BPF prog-id=34 op=LOAD Dec 16 12:11:25.602000 audit: BPF prog-id=23 op=UNLOAD Dec 16 12:11:25.603000 audit: BPF prog-id=24 op=UNLOAD Dec 16 12:11:25.603000 audit: BPF prog-id=35 op=LOAD Dec 16 12:11:25.603000 audit: BPF prog-id=18 op=UNLOAD Dec 16 12:11:25.603000 audit: BPF prog-id=36 op=LOAD Dec 16 12:11:25.603000 audit: BPF prog-id=37 op=LOAD Dec 16 12:11:25.603000 audit: BPF prog-id=19 op=UNLOAD Dec 16 12:11:25.603000 audit: BPF prog-id=20 op=UNLOAD Dec 16 12:11:25.604000 audit: BPF prog-id=38 op=LOAD Dec 16 12:11:25.604000 audit: BPF prog-id=15 op=UNLOAD Dec 16 12:11:25.604000 audit: BPF prog-id=39 op=LOAD Dec 16 12:11:25.604000 audit: BPF prog-id=40 op=LOAD Dec 16 12:11:25.604000 audit: BPF prog-id=16 op=UNLOAD Dec 16 12:11:25.604000 audit: BPF prog-id=17 op=UNLOAD Dec 16 12:11:25.607624 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 12:11:25.609000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:25.609000 audit: BPF prog-id=8 op=UNLOAD Dec 16 12:11:25.609000 audit: BPF prog-id=7 op=UNLOAD Dec 16 12:11:25.610000 audit: BPF prog-id=41 op=LOAD Dec 16 12:11:25.610000 audit: BPF prog-id=42 op=LOAD Dec 16 12:11:25.612138 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:11:25.614839 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 12:11:25.614868 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 12:11:25.615415 systemd[1]: Reload requested from client PID 1443 ('systemctl') (unit ensure-sysext.service)... Dec 16 12:11:25.615435 systemd[1]: Reloading... Dec 16 12:11:25.616039 systemd-tmpfiles[1444]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 12:11:25.617096 systemd-tmpfiles[1444]: ACLs are not supported, ignoring. Dec 16 12:11:25.617220 systemd-tmpfiles[1444]: ACLs are not supported, ignoring. Dec 16 12:11:25.625785 systemd-tmpfiles[1444]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:11:25.625794 systemd-tmpfiles[1444]: Skipping /boot Dec 16 12:11:25.632155 systemd-tmpfiles[1444]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:11:25.632282 systemd-tmpfiles[1444]: Skipping /boot Dec 16 12:11:25.643052 systemd-udevd[1447]: Using default interface naming scheme 'v257'. Dec 16 12:11:25.665660 zram_generator::config[1477]: No configuration found. 
Dec 16 12:11:25.759688 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 12:11:25.837739 kernel: [drm] pci: virtio-gpu-pci detected at 0000:06:00.0 Dec 16 12:11:25.837829 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 12:11:25.837847 kernel: [drm] features: -context_init Dec 16 12:11:25.839049 kernel: [drm] number of scanouts: 1 Dec 16 12:11:25.840135 kernel: [drm] number of cap sets: 0 Dec 16 12:11:25.843654 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:06:00.0 on minor 0 Dec 16 12:11:25.849667 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 12:11:25.894210 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM. Dec 16 12:11:25.894688 kernel: virtio-pci 0000:06:00.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 12:11:25.896060 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 16 12:11:25.896731 systemd[1]: Reloading finished in 280 ms. Dec 16 12:11:25.917953 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:11:25.918000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:25.921000 audit: BPF prog-id=43 op=LOAD Dec 16 12:11:25.921000 audit: BPF prog-id=32 op=UNLOAD Dec 16 12:11:25.921000 audit: BPF prog-id=44 op=LOAD Dec 16 12:11:25.921000 audit: BPF prog-id=45 op=LOAD Dec 16 12:11:25.921000 audit: BPF prog-id=33 op=UNLOAD Dec 16 12:11:25.921000 audit: BPF prog-id=34 op=UNLOAD Dec 16 12:11:25.922000 audit: BPF prog-id=46 op=LOAD Dec 16 12:11:25.922000 audit: BPF prog-id=38 op=UNLOAD Dec 16 12:11:25.922000 audit: BPF prog-id=47 op=LOAD Dec 16 12:11:25.922000 audit: BPF prog-id=48 op=LOAD Dec 16 12:11:25.922000 audit: BPF prog-id=39 op=UNLOAD Dec 16 12:11:25.922000 audit: BPF prog-id=40 op=UNLOAD Dec 16 12:11:25.923000 audit: BPF prog-id=49 op=LOAD Dec 16 12:11:25.923000 audit: BPF prog-id=35 op=UNLOAD Dec 16 12:11:25.923000 audit: BPF prog-id=50 op=LOAD Dec 16 12:11:25.923000 audit: BPF prog-id=51 op=LOAD Dec 16 12:11:25.923000 audit: BPF prog-id=36 op=UNLOAD Dec 16 12:11:25.923000 audit: BPF prog-id=37 op=UNLOAD Dec 16 12:11:25.930000 audit: BPF prog-id=52 op=LOAD Dec 16 12:11:25.930000 audit: BPF prog-id=53 op=LOAD Dec 16 12:11:25.930000 audit: BPF prog-id=41 op=UNLOAD Dec 16 12:11:25.930000 audit: BPF prog-id=42 op=UNLOAD Dec 16 12:11:25.931000 audit: BPF prog-id=54 op=LOAD Dec 16 12:11:25.931000 audit: BPF prog-id=29 op=UNLOAD Dec 16 12:11:25.931000 audit: BPF prog-id=55 op=LOAD Dec 16 12:11:25.931000 audit: BPF prog-id=56 op=LOAD Dec 16 12:11:25.931000 audit: BPF prog-id=30 op=UNLOAD Dec 16 12:11:25.931000 audit: BPF prog-id=31 op=UNLOAD Dec 16 12:11:25.931000 audit: BPF prog-id=57 op=LOAD Dec 16 12:11:25.931000 audit: BPF prog-id=28 op=UNLOAD Dec 16 12:11:25.934419 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:11:25.935000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:25.960713 systemd[1]: Finished ensure-sysext.service. Dec 16 12:11:25.960000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:25.977057 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:11:25.979888 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 12:11:25.981063 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:11:26.000859 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:11:26.003238 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:11:26.006452 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:11:26.009000 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:11:26.011845 systemd[1]: Starting modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm... Dec 16 12:11:26.013040 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:11:26.013156 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:11:26.015722 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 12:11:26.017704 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 12:11:26.019212 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). 
Dec 16 12:11:26.020520 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 12:11:26.022000 audit: BPF prog-id=58 op=LOAD Dec 16 12:11:26.025381 kernel: pps_core: LinuxPPS API ver. 1 registered Dec 16 12:11:26.025460 kernel: pps_core: Software ver. 5.3.6 - Copyright 2005-2007 Rodolfo Giometti Dec 16 12:11:26.025545 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:11:26.027818 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 12:11:26.030704 kernel: PTP clock support registered Dec 16 12:11:26.031824 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 12:11:26.037586 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:11:26.040811 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:11:26.041064 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:11:26.042740 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:11:26.041000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.043737 kernel: kauditd_printk_skb: 171 callbacks suppressed Dec 16 12:11:26.043792 kernel: audit: type=1130 audit(1765887086.041:217): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.046601 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:11:26.041000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:26.050263 kernel: audit: type=1131 audit(1765887086.041:218): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.051000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.052149 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:11:26.051000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.058180 kernel: audit: type=1130 audit(1765887086.051:219): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.058238 kernel: audit: type=1131 audit(1765887086.051:220): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.058896 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:11:26.059000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.061024 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:11:26.061649 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Dec 16 12:11:26.064673 kernel: audit: type=1130 audit(1765887086.059:221): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.064728 kernel: audit: type=1131 audit(1765887086.059:222): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.064764 kernel: audit: type=1127 audit(1765887086.059:223): pid=1586 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.059000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.059000 audit[1586]: SYSTEM_BOOT pid=1586 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.070000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.072615 systemd[1]: modprobe@ptp_kvm.service: Deactivated successfully. Dec 16 12:11:26.072864 systemd[1]: Finished modprobe@ptp_kvm.service - Load Kernel Module ptp_kvm. Dec 16 12:11:26.070000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:26.077943 kernel: audit: type=1130 audit(1765887086.070:224): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.078024 kernel: audit: type=1131 audit(1765887086.070:225): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.077000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.080660 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 12:11:26.086007 kernel: audit: type=1130 audit(1765887086.077:226): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.077000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@ptp_kvm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.086000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck@dev-disk-by\x2dlabel-OEM comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:26.087624 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. 
Dec 16 12:11:26.088000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Dec 16 12:11:26.088000 audit[1600]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffffcc1e170 a2=420 a3=0 items=0 ppid=1565 pid=1600 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:26.088000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:11:26.089010 augenrules[1600]: No rules Dec 16 12:11:26.089982 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:11:26.090233 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:11:26.099622 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:11:26.099885 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:11:26.107672 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 12:11:26.135532 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:11:26.139689 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 12:11:26.141882 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
Dec 16 12:11:26.157421 systemd-networkd[1584]: lo: Link UP Dec 16 12:11:26.157430 systemd-networkd[1584]: lo: Gained carrier Dec 16 12:11:26.158649 systemd-networkd[1584]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:11:26.158659 systemd-networkd[1584]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:11:26.158746 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:11:26.160205 systemd[1]: Reached target network.target - Network. Dec 16 12:11:26.160613 systemd-networkd[1584]: eth0: Link UP Dec 16 12:11:26.161075 systemd-networkd[1584]: eth0: Gained carrier Dec 16 12:11:26.161101 systemd-networkd[1584]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:11:26.162789 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 12:11:26.165063 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 12:11:26.186720 systemd-networkd[1584]: eth0: DHCPv4 address 10.0.21.106/25, gateway 10.0.21.1 acquired from 10.0.21.1 Dec 16 12:11:26.191581 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 12:11:26.544199 ldconfig[1578]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 12:11:26.549458 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 12:11:26.553333 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 12:11:26.574268 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 12:11:26.575640 systemd[1]: Reached target sysinit.target - System Initialization. 
Dec 16 12:11:26.576694 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 12:11:26.577782 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 12:11:26.579040 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 12:11:26.580119 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 12:11:26.581323 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Dec 16 12:11:26.582771 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Dec 16 12:11:26.583777 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 12:11:26.584878 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 12:11:26.584916 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:11:26.585715 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:11:26.587299 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 12:11:26.589684 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 12:11:26.592981 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 12:11:26.594300 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 12:11:26.595526 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 12:11:26.602054 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 12:11:26.603407 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 12:11:26.605242 systemd[1]: Listening on docker.socket - Docker Socket for the API. 
Dec 16 12:11:26.606356 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:11:26.607263 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:11:26.608200 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:11:26.608232 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:11:26.610742 systemd[1]: Starting chronyd.service - NTP client/server... Dec 16 12:11:26.612476 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 12:11:26.614603 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 12:11:26.616869 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 12:11:26.620802 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 12:11:26.622036 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:26.623987 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 12:11:26.627841 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 12:11:26.628783 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 12:11:26.629858 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 12:11:26.631756 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 12:11:26.646741 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... 
Dec 16 12:11:26.648064 chronyd[1628]: chronyd version 4.8 starting (+CMDMON +REFCLOCK +RTC +PRIVDROP +SCFILTER -SIGND +NTS +SECHASH +IPV6 -DEBUG) Dec 16 12:11:26.649597 chronyd[1628]: Loaded seccomp filter (level 2) Dec 16 12:11:26.650748 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 12:11:26.652554 jq[1635]: false Dec 16 12:11:26.653890 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 12:11:26.656833 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 12:11:26.657275 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 12:11:26.657841 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 12:11:26.660641 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 12:11:26.662500 systemd[1]: Started chronyd.service - NTP client/server. Dec 16 12:11:26.663615 extend-filesystems[1636]: Found /dev/vda6 Dec 16 12:11:26.666121 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 12:11:26.667873 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 12:11:26.668091 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 12:11:26.669499 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 12:11:26.671425 jq[1648]: true Dec 16 12:11:26.669717 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Dec 16 12:11:26.672062 extend-filesystems[1636]: Found /dev/vda9 Dec 16 12:11:26.676328 extend-filesystems[1636]: Checking size of /dev/vda9 Dec 16 12:11:26.682844 systemd[1]: motdgen.service: Deactivated successfully. 
Dec 16 12:11:26.683128 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 12:11:26.686571 jq[1661]: true Dec 16 12:11:26.700065 update_engine[1647]: I20251216 12:11:26.699570 1647 main.cc:92] Flatcar Update Engine starting Dec 16 12:11:26.700928 extend-filesystems[1636]: Resized partition /dev/vda9 Dec 16 12:11:26.706890 extend-filesystems[1684]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 12:11:26.711372 kernel: EXT4-fs (vda9): resizing filesystem from 1617920 to 11516923 blocks Dec 16 12:11:26.714673 tar[1655]: linux-arm64/LICENSE Dec 16 12:11:26.714673 tar[1655]: linux-arm64/helm Dec 16 12:11:26.720336 dbus-daemon[1631]: [system] SELinux support is enabled Dec 16 12:11:26.720616 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 12:11:26.723570 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 12:11:26.728619 update_engine[1647]: I20251216 12:11:26.728248 1647 update_check_scheduler.cc:74] Next update check in 9m28s Dec 16 12:11:26.723604 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 12:11:26.726055 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 12:11:26.726072 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 12:11:26.728971 systemd[1]: Started update-engine.service - Update Engine. Dec 16 12:11:26.734937 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 12:11:26.778767 systemd-logind[1644]: New seat seat0. 
Dec 16 12:11:26.819412 systemd-logind[1644]: Watching system buttons on /dev/input/event0 (Power Button) Dec 16 12:11:26.819494 systemd-logind[1644]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 16 12:11:26.820835 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 12:11:26.823488 locksmithd[1690]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 12:11:26.856449 bash[1706]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:11:26.860669 containerd[1662]: time="2025-12-16T12:11:26Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 12:11:26.860142 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 12:11:26.864663 containerd[1662]: time="2025-12-16T12:11:26.864591200Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.874883800Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.24µs" Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.874923840Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.874967640Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.874980440Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875124000Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 
Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875139160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875186360Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875197160Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875467840Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875482520Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875492640Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875657 containerd[1662]: time="2025-12-16T12:11:26.875500280Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.875995 containerd[1662]: time="2025-12-16T12:11:26.875974560Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.876042 containerd[1662]: time="2025-12-16T12:11:26.876030240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 
16 12:11:26.876176 containerd[1662]: time="2025-12-16T12:11:26.876157320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.876431 containerd[1662]: time="2025-12-16T12:11:26.876407680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.876518 containerd[1662]: time="2025-12-16T12:11:26.876503200Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:11:26.876562 containerd[1662]: time="2025-12-16T12:11:26.876551800Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 12:11:26.876694 containerd[1662]: time="2025-12-16T12:11:26.876626280Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 12:11:26.878449 containerd[1662]: time="2025-12-16T12:11:26.878417640Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 12:11:26.878610 containerd[1662]: time="2025-12-16T12:11:26.878593240Z" level=info msg="metadata content store policy set" policy=shared Dec 16 12:11:26.884155 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Dec 16 12:11:26.886977 systemd[1]: Starting sshkeys.service... Dec 16 12:11:26.901781 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 12:11:26.904727 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Dec 16 12:11:26.912944 containerd[1662]: time="2025-12-16T12:11:26.912903720Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:11:26.913121 containerd[1662]: time="2025-12-16T12:11:26.913103320Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:11:26.913312 containerd[1662]: time="2025-12-16T12:11:26.913293680Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:11:26.913380 containerd[1662]: time="2025-12-16T12:11:26.913366320Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:11:26.913503 containerd[1662]: time="2025-12-16T12:11:26.913486960Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:11:26.913574 containerd[1662]: time="2025-12-16T12:11:26.913561280Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:11:26.913712 containerd[1662]: time="2025-12-16T12:11:26.913614320Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:11:26.913804 containerd[1662]: time="2025-12-16T12:11:26.913787920Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:11:26.913865 containerd[1662]: time="2025-12-16T12:11:26.913843240Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:11:26.913961 containerd[1662]: time="2025-12-16T12:11:26.913904560Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 12:11:26.914036 containerd[1662]: time="2025-12-16T12:11:26.914012240Z" level=info 
msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:11:26.914091 containerd[1662]: time="2025-12-16T12:11:26.914079800Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:11:26.914151 containerd[1662]: time="2025-12-16T12:11:26.914138840Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:11:26.914203 containerd[1662]: time="2025-12-16T12:11:26.914193480Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:11:26.914427 containerd[1662]: time="2025-12-16T12:11:26.914399440Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:11:26.914550 containerd[1662]: time="2025-12-16T12:11:26.914532560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:11:26.914622 containerd[1662]: time="2025-12-16T12:11:26.914609000Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:11:26.914717 containerd[1662]: time="2025-12-16T12:11:26.914702080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:11:26.914778 containerd[1662]: time="2025-12-16T12:11:26.914767080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914827560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914851760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914868320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases 
type=io.containerd.grpc.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914879720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914892000Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914903760Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914935320Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914972200Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.914990000Z" level=info msg="Start snapshots syncer" Dec 16 12:11:26.915199 containerd[1662]: time="2025-12-16T12:11:26.915023640Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:11:26.915704 containerd[1662]: time="2025-12-16T12:11:26.915666240Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 12:11:26.916296 containerd[1662]: time="2025-12-16T12:11:26.916220320Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:11:26.916907 containerd[1662]: 
time="2025-12-16T12:11:26.916798560Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:11:26.917139 containerd[1662]: time="2025-12-16T12:11:26.917111640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:11:26.917254 containerd[1662]: time="2025-12-16T12:11:26.917238320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:11:26.917321 containerd[1662]: time="2025-12-16T12:11:26.917309840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:11:26.917459 containerd[1662]: time="2025-12-16T12:11:26.917384040Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:11:26.917459 containerd[1662]: time="2025-12-16T12:11:26.917438400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:11:26.917550 containerd[1662]: time="2025-12-16T12:11:26.917535240Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:11:26.917613 containerd[1662]: time="2025-12-16T12:11:26.917591760Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:11:26.917786 containerd[1662]: time="2025-12-16T12:11:26.917681440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 12:11:26.917786 containerd[1662]: time="2025-12-16T12:11:26.917709560Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:11:26.917884 containerd[1662]: time="2025-12-16T12:11:26.917868120Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:11:26.918013 containerd[1662]: 
time="2025-12-16T12:11:26.917985480Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:11:26.918081 containerd[1662]: time="2025-12-16T12:11:26.918062960Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:11:26.918214 containerd[1662]: time="2025-12-16T12:11:26.918129680Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:11:26.918214 containerd[1662]: time="2025-12-16T12:11:26.918154040Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:11:26.918214 containerd[1662]: time="2025-12-16T12:11:26.918165600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:11:26.918423 containerd[1662]: time="2025-12-16T12:11:26.918175560Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:11:26.918487 containerd[1662]: time="2025-12-16T12:11:26.918474360Z" level=info msg="runtime interface created" Dec 16 12:11:26.918542 containerd[1662]: time="2025-12-16T12:11:26.918524640Z" level=info msg="created NRI interface" Dec 16 12:11:26.918591 containerd[1662]: time="2025-12-16T12:11:26.918580080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:11:26.918743 containerd[1662]: time="2025-12-16T12:11:26.918676560Z" level=info msg="Connect containerd service" Dec 16 12:11:26.918743 containerd[1662]: time="2025-12-16T12:11:26.918709520Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:11:26.919939 containerd[1662]: time="2025-12-16T12:11:26.919915600Z" level=error msg="failed to load cni during init, please check CRI plugin status 
before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:11:26.924699 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027357800Z" level=info msg="Start subscribing containerd event" Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027440440Z" level=info msg="Start recovering state" Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027527400Z" level=info msg="Start event monitor" Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027539560Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027557480Z" level=info msg="Start streaming server" Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027582360Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027590000Z" level=info msg="runtime interface starting up..." Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027595040Z" level=info msg="starting plugins..." Dec 16 12:11:27.027922 containerd[1662]: time="2025-12-16T12:11:27.027609520Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:11:27.028399 containerd[1662]: time="2025-12-16T12:11:27.028258120Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:11:27.028399 containerd[1662]: time="2025-12-16T12:11:27.028325480Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 12:11:27.028399 containerd[1662]: time="2025-12-16T12:11:27.028380200Z" level=info msg="containerd successfully booted in 0.169092s" Dec 16 12:11:27.028678 systemd[1]: Started containerd.service - containerd container runtime. 
Dec 16 12:11:27.049653 kernel: EXT4-fs (vda9): resized filesystem to 11516923 Dec 16 12:11:27.070694 extend-filesystems[1684]: Filesystem at /dev/vda9 is mounted on /; on-line resizing required Dec 16 12:11:27.070694 extend-filesystems[1684]: old_desc_blocks = 1, new_desc_blocks = 6 Dec 16 12:11:27.070694 extend-filesystems[1684]: The filesystem on /dev/vda9 is now 11516923 (4k) blocks long. Dec 16 12:11:27.076830 extend-filesystems[1636]: Resized filesystem in /dev/vda9 Dec 16 12:11:27.072068 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:11:27.072716 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Dec 16 12:11:27.146967 tar[1655]: linux-arm64/README.md Dec 16 12:11:27.163836 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 12:11:27.634694 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:27.937653 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:28.056773 systemd-networkd[1584]: eth0: Gained IPv6LL Dec 16 12:11:28.058334 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:11:28.060625 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:11:28.063096 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:11:28.065277 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:11:28.097740 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:11:28.966137 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:11:28.969831 (kubelet)[1759]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:11:29.310973 sshd_keygen[1654]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 12:11:29.329960 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Dec 16 12:11:29.332816 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 12:11:29.334883 systemd[1]: Started sshd@0-10.0.21.106:22-139.178.68.195:33432.service - OpenSSH per-connection server daemon (139.178.68.195:33432). Dec 16 12:11:29.358881 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 12:11:29.359148 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 12:11:29.362728 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 12:11:29.389827 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 12:11:29.393248 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:11:29.397512 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 12:11:29.399033 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:11:29.641698 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:29.686680 kubelet[1759]: E1216 12:11:29.686604 1759 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:11:29.688999 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:11:29.689133 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:11:29.689580 systemd[1]: kubelet.service: Consumed 810ms CPU time, 259.4M memory peak. 
Dec 16 12:11:29.945657 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:30.236479 sshd[1774]: Accepted publickey for core from 139.178.68.195 port 33432 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:30.240055 sshd-session[1774]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:30.247949 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:11:30.250439 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:11:30.256831 systemd-logind[1644]: New session 1 of user core. Dec 16 12:11:30.277716 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:11:30.281426 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:11:30.307009 (systemd)[1791]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:30.309803 systemd-logind[1644]: New session 2 of user core. Dec 16 12:11:30.419660 systemd[1791]: Queued start job for default target default.target. Dec 16 12:11:30.442915 systemd[1791]: Created slice app.slice - User Application Slice. Dec 16 12:11:30.442950 systemd[1791]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:11:30.442962 systemd[1791]: Reached target paths.target - Paths. Dec 16 12:11:30.443009 systemd[1791]: Reached target timers.target - Timers. Dec 16 12:11:30.444212 systemd[1791]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:11:30.444935 systemd[1791]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:11:30.454352 systemd[1791]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:11:30.454642 systemd[1791]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:11:30.454804 systemd[1791]: Reached target sockets.target - Sockets. 
Dec 16 12:11:30.454849 systemd[1791]: Reached target basic.target - Basic System. Dec 16 12:11:30.454877 systemd[1791]: Reached target default.target - Main User Target. Dec 16 12:11:30.454902 systemd[1791]: Startup finished in 139ms. Dec 16 12:11:30.455159 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:11:30.465017 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 12:11:30.983403 systemd[1]: Started sshd@1-10.0.21.106:22-139.178.68.195:37000.service - OpenSSH per-connection server daemon (139.178.68.195:37000). Dec 16 12:11:31.871695 sshd[1805]: Accepted publickey for core from 139.178.68.195 port 37000 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:31.873016 sshd-session[1805]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:31.876582 systemd-logind[1644]: New session 3 of user core. Dec 16 12:11:31.885100 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:11:32.377408 sshd[1809]: Connection closed by 139.178.68.195 port 37000 Dec 16 12:11:32.377828 sshd-session[1805]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:32.381829 systemd[1]: sshd@1-10.0.21.106:22-139.178.68.195:37000.service: Deactivated successfully. Dec 16 12:11:32.384264 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:11:32.385342 systemd-logind[1644]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:11:32.386190 systemd-logind[1644]: Removed session 3. Dec 16 12:11:32.550851 systemd[1]: Started sshd@2-10.0.21.106:22-139.178.68.195:37012.service - OpenSSH per-connection server daemon (139.178.68.195:37012). 
Dec 16 12:11:33.403090 sshd[1815]: Accepted publickey for core from 139.178.68.195 port 37012 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:33.404603 sshd-session[1815]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:33.408468 systemd-logind[1644]: New session 4 of user core. Dec 16 12:11:33.418899 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:11:33.654659 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:33.660718 coreos-metadata[1630]: Dec 16 12:11:33.660 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:11:33.679425 coreos-metadata[1630]: Dec 16 12:11:33.679 INFO Fetching http://169.254.169.254/openstack/2012-08-10/meta_data.json: Attempt #1 Dec 16 12:11:33.896666 sshd[1819]: Connection closed by 139.178.68.195 port 37012 Dec 16 12:11:33.896749 sshd-session[1815]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:33.900579 systemd[1]: sshd@2-10.0.21.106:22-139.178.68.195:37012.service: Deactivated successfully. Dec 16 12:11:33.902271 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:11:33.904410 systemd-logind[1644]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:11:33.905683 systemd-logind[1644]: Removed session 4. 
Dec 16 12:11:33.952675 kernel: /dev/disk/by-label/config-2: Can't lookup blockdev Dec 16 12:11:33.959105 coreos-metadata[1717]: Dec 16 12:11:33.959 WARN failed to locate config-drive, using the metadata service API instead Dec 16 12:11:33.971976 coreos-metadata[1717]: Dec 16 12:11:33.971 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys: Attempt #1 Dec 16 12:11:34.054981 coreos-metadata[1630]: Dec 16 12:11:34.054 INFO Fetch successful Dec 16 12:11:34.055257 coreos-metadata[1630]: Dec 16 12:11:34.055 INFO Fetching http://169.254.169.254/latest/meta-data/hostname: Attempt #1 Dec 16 12:11:34.206238 coreos-metadata[1717]: Dec 16 12:11:34.206 INFO Fetch successful Dec 16 12:11:34.206238 coreos-metadata[1717]: Dec 16 12:11:34.206 INFO Fetching http://169.254.169.254/latest/meta-data/public-keys/0/openssh-key: Attempt #1 Dec 16 12:11:34.288929 coreos-metadata[1630]: Dec 16 12:11:34.288 INFO Fetch successful Dec 16 12:11:34.288929 coreos-metadata[1630]: Dec 16 12:11:34.288 INFO Fetching http://169.254.169.254/latest/meta-data/instance-id: Attempt #1 Dec 16 12:11:34.440772 coreos-metadata[1717]: Dec 16 12:11:34.440 INFO Fetch successful Dec 16 12:11:34.442925 unknown[1717]: wrote ssh authorized keys file for user: core Dec 16 12:11:34.448479 coreos-metadata[1630]: Dec 16 12:11:34.448 INFO Fetch successful Dec 16 12:11:34.448479 coreos-metadata[1630]: Dec 16 12:11:34.448 INFO Fetching http://169.254.169.254/latest/meta-data/instance-type: Attempt #1 Dec 16 12:11:34.475769 update-ssh-keys[1829]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:11:34.476744 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 12:11:34.478379 systemd[1]: Finished sshkeys.service. 
Dec 16 12:11:34.572905 coreos-metadata[1630]: Dec 16 12:11:34.572 INFO Fetch successful Dec 16 12:11:34.573143 coreos-metadata[1630]: Dec 16 12:11:34.573 INFO Fetching http://169.254.169.254/latest/meta-data/local-ipv4: Attempt #1 Dec 16 12:11:34.698387 coreos-metadata[1630]: Dec 16 12:11:34.698 INFO Fetch successful Dec 16 12:11:34.698854 coreos-metadata[1630]: Dec 16 12:11:34.698 INFO Fetching http://169.254.169.254/latest/meta-data/public-ipv4: Attempt #1 Dec 16 12:11:34.826255 coreos-metadata[1630]: Dec 16 12:11:34.826 INFO Fetch successful Dec 16 12:11:34.855526 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:11:34.856142 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 12:11:34.856275 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:11:34.856967 systemd[1]: Startup finished in 2.397s (kernel) + 14.129s (initrd) + 10.657s (userspace) = 27.184s. Dec 16 12:11:39.852739 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:11:39.854277 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:11:39.991387 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:11:39.995444 (kubelet)[1849]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:11:40.034336 kubelet[1849]: E1216 12:11:40.034290 1849 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:11:40.037856 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:11:40.037980 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:11:40.038494 systemd[1]: kubelet.service: Consumed 141ms CPU time, 107.9M memory peak. Dec 16 12:11:44.080364 systemd[1]: Started sshd@3-10.0.21.106:22-139.178.68.195:53450.service - OpenSSH per-connection server daemon (139.178.68.195:53450). Dec 16 12:11:44.950657 sshd[1859]: Accepted publickey for core from 139.178.68.195 port 53450 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:44.952419 sshd-session[1859]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:44.956673 systemd-logind[1644]: New session 5 of user core. Dec 16 12:11:44.972012 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:11:45.446560 sshd[1863]: Connection closed by 139.178.68.195 port 53450 Dec 16 12:11:45.447145 sshd-session[1859]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:45.450958 systemd[1]: sshd@3-10.0.21.106:22-139.178.68.195:53450.service: Deactivated successfully. Dec 16 12:11:45.452487 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:11:45.454245 systemd-logind[1644]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:11:45.455270 systemd-logind[1644]: Removed session 5. 
Dec 16 12:11:45.632228 systemd[1]: Started sshd@4-10.0.21.106:22-139.178.68.195:53456.service - OpenSSH per-connection server daemon (139.178.68.195:53456). Dec 16 12:11:46.496339 sshd[1869]: Accepted publickey for core from 139.178.68.195 port 53456 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:46.497716 sshd-session[1869]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:46.501519 systemd-logind[1644]: New session 6 of user core. Dec 16 12:11:46.516907 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 12:11:46.993398 sshd[1873]: Connection closed by 139.178.68.195 port 53456 Dec 16 12:11:46.993940 sshd-session[1869]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:46.997655 systemd-logind[1644]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:11:46.997795 systemd[1]: sshd@4-10.0.21.106:22-139.178.68.195:53456.service: Deactivated successfully. Dec 16 12:11:46.999289 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:11:47.000692 systemd-logind[1644]: Removed session 6. Dec 16 12:11:47.188951 systemd[1]: Started sshd@5-10.0.21.106:22-139.178.68.195:53468.service - OpenSSH per-connection server daemon (139.178.68.195:53468). Dec 16 12:11:48.116675 sshd[1879]: Accepted publickey for core from 139.178.68.195 port 53468 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:48.117998 sshd-session[1879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:48.122763 systemd-logind[1644]: New session 7 of user core. Dec 16 12:11:48.132859 systemd[1]: Started session-7.scope - Session 7 of User core. 
Dec 16 12:11:48.643483 sshd[1883]: Connection closed by 139.178.68.195 port 53468 Dec 16 12:11:48.643925 sshd-session[1879]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:48.647181 systemd[1]: sshd@5-10.0.21.106:22-139.178.68.195:53468.service: Deactivated successfully. Dec 16 12:11:48.648765 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:11:48.652105 systemd-logind[1644]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:11:48.653073 systemd-logind[1644]: Removed session 7. Dec 16 12:11:48.819036 systemd[1]: Started sshd@6-10.0.21.106:22-139.178.68.195:53478.service - OpenSSH per-connection server daemon (139.178.68.195:53478). Dec 16 12:11:49.694001 sshd[1889]: Accepted publickey for core from 139.178.68.195 port 53478 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:49.695250 sshd-session[1889]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:49.698934 systemd-logind[1644]: New session 8 of user core. Dec 16 12:11:49.712786 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 12:11:50.048716 sudo[1894]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:11:50.048977 sudo[1894]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:11:50.064385 sudo[1894]: pam_unix(sudo:session): session closed for user root Dec 16 12:11:50.102747 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:11:50.104449 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:11:50.232692 sshd[1893]: Connection closed by 139.178.68.195 port 53478 Dec 16 12:11:50.232598 sshd-session[1889]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:50.236595 systemd-logind[1644]: Session 8 logged out. Waiting for processes to exit. 
Dec 16 12:11:50.236731 systemd[1]: sshd@6-10.0.21.106:22-139.178.68.195:53478.service: Deactivated successfully. Dec 16 12:11:50.241091 systemd[1]: session-8.scope: Deactivated successfully. Dec 16 12:11:50.243024 systemd-logind[1644]: Removed session 8. Dec 16 12:11:50.262315 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:11:50.266413 (kubelet)[1908]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:11:50.298118 kubelet[1908]: E1216 12:11:50.298067 1908 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:11:50.300835 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:11:50.300965 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:11:50.301525 systemd[1]: kubelet.service: Consumed 137ms CPU time, 107.5M memory peak. Dec 16 12:11:50.414168 systemd[1]: Started sshd@7-10.0.21.106:22-139.178.68.195:53480.service - OpenSSH per-connection server daemon (139.178.68.195:53480). Dec 16 12:11:50.434429 chronyd[1628]: Selected source PHC0 Dec 16 12:11:51.232330 sshd[1917]: Accepted publickey for core from 139.178.68.195 port 53480 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:51.233562 sshd-session[1917]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:51.237033 systemd-logind[1644]: New session 9 of user core. Dec 16 12:11:51.251239 systemd[1]: Started session-9.scope - Session 9 of User core. 
Dec 16 12:11:51.542158 sudo[1923]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:11:51.542393 sudo[1923]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:11:51.545220 sudo[1923]: pam_unix(sudo:session): session closed for user root Dec 16 12:11:51.550116 sudo[1922]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:11:51.550346 sudo[1922]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:11:51.556215 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:11:51.589000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:11:51.590712 augenrules[1947]: No rules Dec 16 12:11:51.591050 kernel: kauditd_printk_skb: 5 callbacks suppressed Dec 16 12:11:51.591083 kernel: audit: type=1305 audit(1765887111.589:230): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:11:51.591955 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:11:51.592197 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Dec 16 12:11:51.589000 audit[1947]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff7929820 a2=420 a3=0 items=0 ppid=1928 pid=1947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:51.593542 sudo[1922]: pam_unix(sudo:session): session closed for user root Dec 16 12:11:51.595946 kernel: audit: type=1300 audit(1765887111.589:230): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff7929820 a2=420 a3=0 items=0 ppid=1928 pid=1947 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:51.596079 kernel: audit: type=1327 audit(1765887111.589:230): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:11:51.589000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:11:51.590000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.599691 kernel: audit: type=1130 audit(1765887111.590:231): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.599721 kernel: audit: type=1131 audit(1765887111.590:232): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:11:51.590000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.591000 audit[1922]: USER_END pid=1922 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.604208 kernel: audit: type=1106 audit(1765887111.591:233): pid=1922 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.604238 kernel: audit: type=1104 audit(1765887111.593:234): pid=1922 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.593000 audit[1922]: CRED_DISP pid=1922 uid=500 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.746241 sshd[1921]: Connection closed by 139.178.68.195 port 53480 Dec 16 12:11:51.746158 sshd-session[1917]: pam_unix(sshd:session): session closed for user core Dec 16 12:11:51.745000 audit[1917]: USER_END pid=1917 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:51.749747 systemd[1]: sshd@7-10.0.21.106:22-139.178.68.195:53480.service: Deactivated successfully. 
Dec 16 12:11:51.751141 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:11:51.745000 audit[1917]: CRED_DISP pid=1917 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:51.754660 kernel: audit: type=1106 audit(1765887111.745:235): pid=1917 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:51.754761 kernel: audit: type=1104 audit(1765887111.745:236): pid=1917 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:51.754815 kernel: audit: type=1131 audit(1765887111.748:237): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.106:22-139.178.68.195:53480 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.748000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-10.0.21.106:22-139.178.68.195:53480 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:51.757419 systemd-logind[1644]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:11:51.758342 systemd-logind[1644]: Removed session 9. Dec 16 12:11:51.912558 systemd[1]: Started sshd@8-10.0.21.106:22-139.178.68.195:36766.service - OpenSSH per-connection server daemon (139.178.68.195:36766). 
Dec 16 12:11:51.912000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.106:22-139.178.68.195:36766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:52.711000 audit[1956]: USER_ACCT pid=1956 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:52.713659 sshd[1956]: Accepted publickey for core from 139.178.68.195 port 36766 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:11:52.712000 audit[1956]: CRED_ACQ pid=1956 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:52.712000 audit[1956]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd78ef630 a2=3 a3=0 items=0 ppid=1 pid=1956 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:52.712000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:11:52.714896 sshd-session[1956]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:11:52.718341 systemd-logind[1644]: New session 10 of user core. Dec 16 12:11:52.727791 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 16 12:11:52.728000 audit[1956]: USER_START pid=1956 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:52.730000 audit[1960]: CRED_ACQ pid=1960 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:11:53.020000 audit[1961]: USER_ACCT pid=1961 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:53.020000 audit[1961]: CRED_REFR pid=1961 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:53.022109 sudo[1961]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:11:53.020000 audit[1961]: USER_START pid=1961 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:11:53.022340 sudo[1961]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:11:53.311348 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Dec 16 12:11:53.324115 (dockerd)[1982]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:11:53.542244 dockerd[1982]: time="2025-12-16T12:11:53.542195356Z" level=info msg="Starting up" Dec 16 12:11:53.543075 dockerd[1982]: time="2025-12-16T12:11:53.543030405Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:11:53.552498 dockerd[1982]: time="2025-12-16T12:11:53.552462784Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:11:53.588660 dockerd[1982]: time="2025-12-16T12:11:53.588341218Z" level=info msg="Loading containers: start." Dec 16 12:11:53.598640 kernel: Initializing XFRM netlink socket Dec 16 12:11:53.642000 audit[2032]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.642000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff2ec8f70 a2=0 a3=0 items=0 ppid=1982 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.642000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:11:53.644000 audit[2034]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=2034 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.644000 audit[2034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc959c8f0 a2=0 a3=0 items=0 ppid=1982 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:11:53.644000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:11:53.646000 audit[2036]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.646000 audit[2036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff02e3680 a2=0 a3=0 items=0 ppid=1982 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.646000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:11:53.646000 audit[2038]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.646000 audit[2038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd76a9430 a2=0 a3=0 items=0 ppid=1982 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.646000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:11:53.648000 audit[2040]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=2040 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.648000 audit[2040]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffeb6ad430 a2=0 a3=0 items=0 ppid=1982 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.648000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:11:53.649000 audit[2042]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2042 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.649000 audit[2042]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd6f7e8e0 a2=0 a3=0 items=0 ppid=1982 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.649000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:11:53.652000 audit[2044]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.652000 audit[2044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd83131d0 a2=0 a3=0 items=0 ppid=1982 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.652000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:11:53.654000 audit[2046]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2046 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.654000 audit[2046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffd9726c30 a2=0 a3=0 items=0 ppid=1982 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:11:53.654000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:11:53.691000 audit[2049]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2049 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.691000 audit[2049]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffd0342860 a2=0 a3=0 items=0 ppid=1982 pid=2049 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.691000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:11:53.692000 audit[2051]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.692000 audit[2051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffedcfa170 a2=0 a3=0 items=0 ppid=1982 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.692000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:11:53.695000 audit[2053]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.695000 audit[2053]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffcba308b0 a2=0 a3=0 items=0 ppid=1982 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.695000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:11:53.698000 audit[2055]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.698000 audit[2055]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=fffffd82cc00 a2=0 a3=0 items=0 ppid=1982 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.698000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:11:53.699000 audit[2057]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.699000 audit[2057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffefd6aa90 a2=0 a3=0 items=0 ppid=1982 pid=2057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.699000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:11:53.730000 audit[2087]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.730000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=ffffed209490 a2=0 a3=0 items=0 ppid=1982 pid=2087 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.730000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:11:53.732000 audit[2089]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2089 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.732000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffd6b31760 a2=0 a3=0 items=0 ppid=1982 pid=2089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.732000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:11:53.734000 audit[2091]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.734000 audit[2091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff1e95c20 a2=0 a3=0 items=0 ppid=1982 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.734000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:11:53.734000 audit[2093]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2093 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.734000 audit[2093]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd39406f0 a2=0 a3=0 items=0 ppid=1982 pid=2093 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.734000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:11:53.736000 audit[2095]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2095 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.736000 audit[2095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc02270b0 a2=0 a3=0 items=0 ppid=1982 pid=2095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.736000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:11:53.737000 audit[2097]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2097 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.737000 audit[2097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffd8b952f0 a2=0 a3=0 items=0 ppid=1982 pid=2097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.737000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:11:53.740000 audit[2099]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.740000 audit[2099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff3939220 a2=0 a3=0 items=0 ppid=1982 pid=2099 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.740000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:11:53.742000 audit[2101]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2101 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.742000 audit[2101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffc5a661e0 a2=0 a3=0 items=0 ppid=1982 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.742000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:11:53.744000 audit[2103]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.744000 audit[2103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffc4c34310 a2=0 a3=0 items=0 ppid=1982 pid=2103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.744000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 12:11:53.745000 audit[2105]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2105 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.745000 audit[2105]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe9993ba0 a2=0 a3=0 items=0 ppid=1982 pid=2105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.745000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:11:53.746000 audit[2107]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2107 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.746000 audit[2107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe951d910 a2=0 a3=0 items=0 ppid=1982 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.746000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:11:53.748000 audit[2109]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.748000 audit[2109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffc3f4cc00 a2=0 a3=0 items=0 ppid=1982 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.748000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:11:53.750000 audit[2111]: NETFILTER_CFG 
table=filter:27 family=10 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.750000 audit[2111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffedf1dfc0 a2=0 a3=0 items=0 ppid=1982 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.750000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:11:53.755000 audit[2116]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2116 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.755000 audit[2116]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffbb50e90 a2=0 a3=0 items=0 ppid=1982 pid=2116 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.755000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:11:53.756000 audit[2118]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2118 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.756000 audit[2118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffeff3f3d0 a2=0 a3=0 items=0 ppid=1982 pid=2118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.756000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:11:53.757000 audit[2120]: NETFILTER_CFG 
table=filter:30 family=2 entries=1 op=nft_register_rule pid=2120 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.757000 audit[2120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffdacc3cc0 a2=0 a3=0 items=0 ppid=1982 pid=2120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.757000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:11:53.759000 audit[2122]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.759000 audit[2122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffecae7070 a2=0 a3=0 items=0 ppid=1982 pid=2122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.759000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:11:53.761000 audit[2124]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2124 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.761000 audit[2124]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffffe6ad850 a2=0 a3=0 items=0 ppid=1982 pid=2124 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.761000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:11:53.763000 audit[2126]: NETFILTER_CFG table=filter:33 
family=10 entries=1 op=nft_register_rule pid=2126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:11:53.763000 audit[2126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=fffff74283e0 a2=0 a3=0 items=0 ppid=1982 pid=2126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.763000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:11:53.778000 audit[2132]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2132 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.778000 audit[2132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffcabafaa0 a2=0 a3=0 items=0 ppid=1982 pid=2132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.778000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:11:53.780000 audit[2134]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2134 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.780000 audit[2134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffc2fccfb0 a2=0 a3=0 items=0 ppid=1982 pid=2134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.780000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:11:53.788000 audit[2142]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2142 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.788000 audit[2142]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffde0e90e0 a2=0 a3=0 items=0 ppid=1982 pid=2142 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.788000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:11:53.796000 audit[2148]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2148 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.796000 audit[2148]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffe429c860 a2=0 a3=0 items=0 ppid=1982 pid=2148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.796000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:11:53.798000 audit[2150]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.798000 audit[2150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffc285e200 a2=0 a3=0 items=0 ppid=1982 pid=2150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.798000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:11:53.800000 audit[2152]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.800000 audit[2152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=fffff1af76a0 a2=0 a3=0 items=0 ppid=1982 pid=2152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.800000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:11:53.801000 audit[2154]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2154 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:11:53.801000 audit[2154]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=ffffec20eaa0 a2=0 a3=0 items=0 ppid=1982 pid=2154 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.801000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:11:53.802000 audit[2156]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2156 subj=system_u:system_r:kernel_t:s0 
comm="iptables" Dec 16 12:11:53.802000 audit[2156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffd91c2ee0 a2=0 a3=0 items=0 ppid=1982 pid=2156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:11:53.802000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:11:53.804949 systemd-networkd[1584]: docker0: Link UP Dec 16 12:11:53.809539 dockerd[1982]: time="2025-12-16T12:11:53.809499144Z" level=info msg="Loading containers: done." Dec 16 12:11:53.820180 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck2010358848-merged.mount: Deactivated successfully. Dec 16 12:11:53.831195 dockerd[1982]: time="2025-12-16T12:11:53.831070790Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:11:53.831356 dockerd[1982]: time="2025-12-16T12:11:53.831214707Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:11:53.831402 dockerd[1982]: time="2025-12-16T12:11:53.831380991Z" level=info msg="Initializing buildkit" Dec 16 12:11:53.851457 dockerd[1982]: time="2025-12-16T12:11:53.851347844Z" level=info msg="Completed buildkit initialization" Dec 16 12:11:53.857426 dockerd[1982]: time="2025-12-16T12:11:53.857343520Z" level=info msg="Daemon has completed initialization" Dec 16 12:11:53.857537 dockerd[1982]: time="2025-12-16T12:11:53.857410657Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:11:53.857693 systemd[1]: Started docker.service - Docker Application Container Engine. 
Dec 16 12:11:53.856000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:11:55.006586 containerd[1662]: time="2025-12-16T12:11:55.006543488Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\"" Dec 16 12:11:55.783309 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1562894886.mount: Deactivated successfully. Dec 16 12:11:56.356411 containerd[1662]: time="2025-12-16T12:11:56.356333780Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:56.357752 containerd[1662]: time="2025-12-16T12:11:56.357713783Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.33.7: active requests=0, bytes read=25791094" Dec 16 12:11:56.359344 containerd[1662]: time="2025-12-16T12:11:56.359289508Z" level=info msg="ImageCreate event name:\"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:56.362421 containerd[1662]: time="2025-12-16T12:11:56.362367556Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:56.364355 containerd[1662]: time="2025-12-16T12:11:56.364246602Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.33.7\" with image id \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\", repo tag \"registry.k8s.io/kube-apiserver:v1.33.7\", repo digest \"registry.k8s.io/kube-apiserver@sha256:9585226cb85d1dc0f0ef5f7a75f04e4bc91ddd82de249533bd293aa3cf958dab\", size \"27383880\" in 1.357651317s" Dec 16 12:11:56.364355 containerd[1662]: time="2025-12-16T12:11:56.364328602Z" level=info 
msg="PullImage \"registry.k8s.io/kube-apiserver:v1.33.7\" returns image reference \"sha256:6d7bc8e445519fe4d49eee834f33f3e165eef4d3c0919ba08c67cdf8db905b7e\"" Dec 16 12:11:56.365876 containerd[1662]: time="2025-12-16T12:11:56.365848726Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\"" Dec 16 12:11:57.502789 containerd[1662]: time="2025-12-16T12:11:57.502224330Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:57.503588 containerd[1662]: time="2025-12-16T12:11:57.503520454Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.33.7: active requests=0, bytes read=23544927" Dec 16 12:11:57.504398 containerd[1662]: time="2025-12-16T12:11:57.504348736Z" level=info msg="ImageCreate event name:\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:57.507734 containerd[1662]: time="2025-12-16T12:11:57.507681386Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:57.508607 containerd[1662]: time="2025-12-16T12:11:57.508567668Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.33.7\" with image id \"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\", repo tag \"registry.k8s.io/kube-controller-manager:v1.33.7\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:f69d77ca0626b5a4b7b432c18de0952941181db7341c80eb89731f46d1d0c230\", size \"25137562\" in 1.142691422s" Dec 16 12:11:57.508607 containerd[1662]: time="2025-12-16T12:11:57.508598988Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.33.7\" returns image reference 
\"sha256:a94595d0240bcc5e538b4b33bbc890512a731425be69643cbee284072f7d8f64\"" Dec 16 12:11:57.509146 containerd[1662]: time="2025-12-16T12:11:57.509035789Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\"" Dec 16 12:11:58.409428 containerd[1662]: time="2025-12-16T12:11:58.409351817Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:58.410607 containerd[1662]: time="2025-12-16T12:11:58.410560221Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.33.7: active requests=0, bytes read=0" Dec 16 12:11:58.411737 containerd[1662]: time="2025-12-16T12:11:58.411689024Z" level=info msg="ImageCreate event name:\"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:58.414175 containerd[1662]: time="2025-12-16T12:11:58.414148551Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:58.415072 containerd[1662]: time="2025-12-16T12:11:58.415038953Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.33.7\" with image id \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\", repo tag \"registry.k8s.io/kube-scheduler:v1.33.7\", repo digest \"registry.k8s.io/kube-scheduler@sha256:21bda321d8b4d48eb059fbc1593203d55d8b3bc7acd0584e04e55504796d78d0\", size \"19882566\" in 905.970643ms" Dec 16 12:11:58.415072 containerd[1662]: time="2025-12-16T12:11:58.415061033Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.33.7\" returns image reference \"sha256:94005b6be50f054c8a4ef3f0d6976644e8b3c6a8bf15a9e8a2eeac3e8331b010\"" Dec 16 12:11:58.415590 containerd[1662]: time="2025-12-16T12:11:58.415539194Z" level=info msg="PullImage 
\"registry.k8s.io/kube-proxy:v1.33.7\"" Dec 16 12:11:59.329888 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2505989656.mount: Deactivated successfully. Dec 16 12:11:59.584056 containerd[1662]: time="2025-12-16T12:11:59.583920048Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.33.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:59.587062 containerd[1662]: time="2025-12-16T12:11:59.586994457Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.33.7: active requests=0, bytes read=18413667" Dec 16 12:11:59.588189 containerd[1662]: time="2025-12-16T12:11:59.588146700Z" level=info msg="ImageCreate event name:\"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:59.590726 containerd[1662]: time="2025-12-16T12:11:59.590651987Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:11:59.591915 containerd[1662]: time="2025-12-16T12:11:59.591863870Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.33.7\" with image id \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\", repo tag \"registry.k8s.io/kube-proxy:v1.33.7\", repo digest \"registry.k8s.io/kube-proxy@sha256:ec25702b19026e9c0d339bc1c3bd231435a59f28b5fccb21e1b1078a357380f5\", size \"28257692\" in 1.176299796s" Dec 16 12:11:59.591915 containerd[1662]: time="2025-12-16T12:11:59.591897470Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.33.7\" returns image reference \"sha256:78ccb937011a53894db229033fd54e237d478ec85315f8b08e5dcaa0f737111b\"" Dec 16 12:11:59.592376 containerd[1662]: time="2025-12-16T12:11:59.592344752Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\"" Dec 16 12:12:00.241583 systemd[1]: 
var-lib-containerd-tmpmounts-containerd\x2dmount2161407375.mount: Deactivated successfully. Dec 16 12:12:00.352452 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 12:12:00.353942 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:12:00.509278 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:12:00.508000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:00.510306 kernel: kauditd_printk_skb: 132 callbacks suppressed Dec 16 12:12:00.510366 kernel: audit: type=1130 audit(1765887120.508:288): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:00.526084 (kubelet)[2304]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:12:00.557561 kubelet[2304]: E1216 12:12:00.557497 2304 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:12:00.559759 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:12:00.559906 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:12:00.561000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=failed' Dec 16 12:12:00.561730 systemd[1]: kubelet.service: Consumed 139ms CPU time, 107.6M memory peak. Dec 16 12:12:00.565658 kernel: audit: type=1131 audit(1765887120.561:289): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:12:00.983589 containerd[1662]: time="2025-12-16T12:12:00.983147966Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:00.984084 containerd[1662]: time="2025-12-16T12:12:00.984036809Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.0: active requests=0, bytes read=18338344" Dec 16 12:12:00.985329 containerd[1662]: time="2025-12-16T12:12:00.985302972Z" level=info msg="ImageCreate event name:\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:00.988623 containerd[1662]: time="2025-12-16T12:12:00.988572422Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:00.989656 containerd[1662]: time="2025-12-16T12:12:00.989492904Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.0\" with image id \"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.0\", repo digest \"registry.k8s.io/coredns/coredns@sha256:40384aa1f5ea6bfdc77997d243aec73da05f27aed0c5e9d65bfa98933c519d97\", size \"19148915\" in 1.397117992s" Dec 16 12:12:00.989656 containerd[1662]: time="2025-12-16T12:12:00.989525584Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.0\" returns image reference 
\"sha256:f72407be9e08c3a1b29a88318cbfee87b9f2da489f84015a5090b1e386e4dbc1\"" Dec 16 12:12:00.990593 containerd[1662]: time="2025-12-16T12:12:00.990427627Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\"" Dec 16 12:12:01.541775 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3416778074.mount: Deactivated successfully. Dec 16 12:12:01.550038 containerd[1662]: time="2025-12-16T12:12:01.549995074Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:12:01.550981 containerd[1662]: time="2025-12-16T12:12:01.550929076Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:12:01.552579 containerd[1662]: time="2025-12-16T12:12:01.552528601Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:12:01.555516 containerd[1662]: time="2025-12-16T12:12:01.555478889Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:12:01.556428 containerd[1662]: time="2025-12-16T12:12:01.556086131Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 565.630344ms" Dec 16 12:12:01.556428 containerd[1662]: time="2025-12-16T12:12:01.556115611Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10\" returns image 
reference \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\"" Dec 16 12:12:01.556556 containerd[1662]: time="2025-12-16T12:12:01.556533132Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\"" Dec 16 12:12:02.203941 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1931235965.mount: Deactivated successfully. Dec 16 12:12:04.238031 containerd[1662]: time="2025-12-16T12:12:04.237970721Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.5.21-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:04.239937 containerd[1662]: time="2025-12-16T12:12:04.239891966Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.5.21-0: active requests=0, bytes read=68134789" Dec 16 12:12:04.240986 containerd[1662]: time="2025-12-16T12:12:04.240943169Z" level=info msg="ImageCreate event name:\"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:04.244010 containerd[1662]: time="2025-12-16T12:12:04.243976777Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:04.245054 containerd[1662]: time="2025-12-16T12:12:04.245020180Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.5.21-0\" with image id \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\", repo tag \"registry.k8s.io/etcd:3.5.21-0\", repo digest \"registry.k8s.io/etcd@sha256:d58c035df557080a27387d687092e3fc2b64c6d0e3162dc51453a115f847d121\", size \"70026017\" in 2.688462248s" Dec 16 12:12:04.245054 containerd[1662]: time="2025-12-16T12:12:04.245049460Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.5.21-0\" returns image reference \"sha256:31747a36ce712f0bf61b50a0c06e99768522025e7b8daedd6dc63d1ae84837b5\"" Dec 16 12:12:09.587285 systemd[1]: Stopped 
kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:12:09.587000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:09.587842 systemd[1]: kubelet.service: Consumed 139ms CPU time, 107.6M memory peak. Dec 16 12:12:09.589782 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:12:09.587000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:09.593563 kernel: audit: type=1130 audit(1765887129.587:290): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:09.593649 kernel: audit: type=1131 audit(1765887129.587:291): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:09.613979 systemd[1]: Reload requested from client PID 2430 ('systemctl') (unit session-10.scope)... Dec 16 12:12:09.613999 systemd[1]: Reloading... Dec 16 12:12:09.683890 zram_generator::config[2478]: No configuration found. Dec 16 12:12:09.855149 systemd[1]: Reloading finished in 240 ms. 
Dec 16 12:12:09.887000 audit: BPF prog-id=63 op=LOAD Dec 16 12:12:09.888639 kernel: audit: type=1334 audit(1765887129.887:292): prog-id=63 op=LOAD Dec 16 12:12:09.888690 kernel: audit: type=1334 audit(1765887129.887:293): prog-id=43 op=UNLOAD Dec 16 12:12:09.887000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:12:09.888000 audit: BPF prog-id=64 op=LOAD Dec 16 12:12:09.890251 kernel: audit: type=1334 audit(1765887129.888:294): prog-id=64 op=LOAD Dec 16 12:12:09.888000 audit: BPF prog-id=65 op=LOAD Dec 16 12:12:09.888000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:12:09.895217 kernel: audit: type=1334 audit(1765887129.888:295): prog-id=65 op=LOAD Dec 16 12:12:09.895252 kernel: audit: type=1334 audit(1765887129.888:296): prog-id=44 op=UNLOAD Dec 16 12:12:09.895280 kernel: audit: type=1334 audit(1765887129.888:297): prog-id=45 op=UNLOAD Dec 16 12:12:09.888000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:12:09.889000 audit: BPF prog-id=66 op=LOAD Dec 16 12:12:09.896831 kernel: audit: type=1334 audit(1765887129.889:298): prog-id=66 op=LOAD Dec 16 12:12:09.896858 kernel: audit: type=1334 audit(1765887129.889:299): prog-id=46 op=UNLOAD Dec 16 12:12:09.889000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:12:09.889000 audit: BPF prog-id=67 op=LOAD Dec 16 12:12:09.889000 audit: BPF prog-id=68 op=LOAD Dec 16 12:12:09.889000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:12:09.889000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:12:09.890000 audit: BPF prog-id=69 op=LOAD Dec 16 12:12:09.890000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:12:09.891000 audit: BPF prog-id=70 op=LOAD Dec 16 12:12:09.891000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:12:09.892000 audit: BPF prog-id=71 op=LOAD Dec 16 12:12:09.892000 audit: BPF prog-id=72 op=LOAD Dec 16 12:12:09.892000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:12:09.892000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:12:09.892000 audit: BPF prog-id=73 op=LOAD Dec 16 12:12:09.892000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:12:09.892000 audit: BPF prog-id=74 
op=LOAD Dec 16 12:12:09.892000 audit: BPF prog-id=75 op=LOAD Dec 16 12:12:09.892000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:12:09.892000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:12:09.893000 audit: BPF prog-id=76 op=LOAD Dec 16 12:12:09.894000 audit: BPF prog-id=77 op=LOAD Dec 16 12:12:09.894000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:12:09.894000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:12:09.895000 audit: BPF prog-id=78 op=LOAD Dec 16 12:12:09.895000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:12:09.913000 audit: BPF prog-id=79 op=LOAD Dec 16 12:12:09.913000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:12:09.913000 audit: BPF prog-id=80 op=LOAD Dec 16 12:12:09.913000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:12:09.913000 audit: BPF prog-id=81 op=LOAD Dec 16 12:12:09.913000 audit: BPF prog-id=82 op=LOAD Dec 16 12:12:09.913000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:12:09.913000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:12:09.934570 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:12:09.934673 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:12:09.934994 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:12:09.934000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:12:09.935052 systemd[1]: kubelet.service: Consumed 90ms CPU time, 95.2M memory peak. Dec 16 12:12:09.936614 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:12:10.055000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:10.055570 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:12:10.061133 (kubelet)[2523]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:12:10.091381 kubelet[2523]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:12:10.091381 kubelet[2523]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:12:10.091381 kubelet[2523]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:12:10.091722 kubelet[2523]: I1216 12:12:10.091423 2523 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:12:11.626658 kubelet[2523]: I1216 12:12:11.625930 2523 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 16 12:12:11.626658 kubelet[2523]: I1216 12:12:11.625962 2523 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:12:11.626658 kubelet[2523]: I1216 12:12:11.626165 2523 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:12:11.677474 kubelet[2523]: E1216 12:12:11.677424 2523 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://10.0.21.106:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 10.0.21.106:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 12:12:11.683646 kubelet[2523]: I1216 12:12:11.683532 2523 
dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:12:11.684697 update_engine[1647]: I20251216 12:12:11.684617 1647 update_attempter.cc:509] Updating boot flags... Dec 16 12:12:11.698118 kubelet[2523]: I1216 12:12:11.698063 2523 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:12:11.700809 kubelet[2523]: I1216 12:12:11.700744 2523 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. defaulting to /" Dec 16 12:12:11.705569 kubelet[2523]: I1216 12:12:11.705448 2523 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:12:11.705697 kubelet[2523]: I1216 12:12:11.705509 2523 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-4-c6e23b3406","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0
,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:12:11.705794 kubelet[2523]: I1216 12:12:11.705779 2523 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:12:11.705794 kubelet[2523]: I1216 12:12:11.705792 2523 container_manager_linux.go:303] "Creating device plugin manager" Dec 16 12:12:11.707055 kubelet[2523]: I1216 12:12:11.706996 2523 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:12:11.719057 kubelet[2523]: I1216 12:12:11.719029 2523 kubelet.go:480] "Attempting to sync node with API server" Dec 16 12:12:11.719057 kubelet[2523]: I1216 12:12:11.719058 2523 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:12:11.719167 kubelet[2523]: I1216 12:12:11.719085 2523 kubelet.go:386] "Adding apiserver pod source" Dec 16 12:12:11.724074 kubelet[2523]: I1216 12:12:11.724048 2523 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:12:11.726404 kubelet[2523]: E1216 12:12:11.724978 2523 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: Get \"https://10.0.21.106:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 10.0.21.106:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:12:11.728820 kubelet[2523]: I1216 12:12:11.728749 2523 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:12:11.728820 kubelet[2523]: E1216 12:12:11.728807 2523 reflector.go:200] "Failed to watch" err="failed to 
list *v1.Node: Get \"https://10.0.21.106:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4547-0-0-4-c6e23b3406&limit=500&resourceVersion=0\": dial tcp 10.0.21.106:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:12:11.731052 kubelet[2523]: I1216 12:12:11.731016 2523 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:12:11.731795 kubelet[2523]: W1216 12:12:11.731173 2523 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 12:12:11.739644 kubelet[2523]: I1216 12:12:11.734622 2523 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 12:12:11.739644 kubelet[2523]: I1216 12:12:11.734687 2523 server.go:1289] "Started kubelet" Dec 16 12:12:11.739644 kubelet[2523]: I1216 12:12:11.736553 2523 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:12:11.741632 kubelet[2523]: E1216 12:12:11.740010 2523 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://10.0.21.106:6443/api/v1/namespaces/default/events\": dial tcp 10.0.21.106:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4547-0-0-4-c6e23b3406.1881b0fd807a4c79 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4547-0-0-4-c6e23b3406,UID:ci-4547-0-0-4-c6e23b3406,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-4-c6e23b3406,},FirstTimestamp:2025-12-16 12:12:11.734658169 +0000 UTC m=+1.670512694,LastTimestamp:2025-12-16 12:12:11.734658169 +0000 UTC m=+1.670512694,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-4-c6e23b3406,}" Dec 16 12:12:11.745212 kubelet[2523]: I1216 12:12:11.745142 2523 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:12:11.745461 kubelet[2523]: I1216 12:12:11.745436 2523 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:12:11.746756 kubelet[2523]: I1216 12:12:11.746529 2523 server.go:317] "Adding debug handlers to kubelet server" Dec 16 12:12:11.747664 kubelet[2523]: I1216 12:12:11.747465 2523 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:12:11.748892 kubelet[2523]: I1216 12:12:11.748845 2523 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 12:12:11.748958 kubelet[2523]: E1216 12:12:11.748939 2523 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" Dec 16 12:12:11.749105 kubelet[2523]: I1216 12:12:11.749085 2523 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:12:11.749229 kubelet[2523]: I1216 12:12:11.749203 2523 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 12:12:11.749280 kubelet[2523]: I1216 12:12:11.749259 2523 reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:12:11.750111 kubelet[2523]: E1216 12:12:11.750075 2523 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://10.0.21.106:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 10.0.21.106:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:12:11.750236 kubelet[2523]: E1216 12:12:11.750144 2523 controller.go:145] "Failed to ensure lease exists, 
will retry" err="Get \"https://10.0.21.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-4-c6e23b3406?timeout=10s\": dial tcp 10.0.21.106:6443: connect: connection refused" interval="200ms" Dec 16 12:12:11.750856 kubelet[2523]: I1216 12:12:11.750824 2523 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:12:11.754141 kubelet[2523]: I1216 12:12:11.754108 2523 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:12:11.754141 kubelet[2523]: I1216 12:12:11.754133 2523 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:12:11.755423 kubelet[2523]: E1216 12:12:11.755387 2523 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:12:11.759000 audit[2556]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2556 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.759000 audit[2556]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffce1ae940 a2=0 a3=0 items=0 ppid=2523 pid=2556 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.759000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:12:11.763000 audit[2558]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2558 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.763000 audit[2558]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc6879e30 a2=0 a3=0 items=0 ppid=2523 pid=2558 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.763000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:12:11.767000 audit[2560]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2560 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.767000 audit[2560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe40dfe60 a2=0 a3=0 items=0 ppid=2523 pid=2560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.767000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:12:11.774000 audit[2563]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2563 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.774000 audit[2563]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffedc39420 a2=0 a3=0 items=0 ppid=2523 pid=2563 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.774000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:12:11.782804 kubelet[2523]: I1216 12:12:11.781977 2523 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:12:11.782883 kubelet[2523]: I1216 12:12:11.782823 2523 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:12:11.782883 kubelet[2523]: I1216 12:12:11.782844 2523 state_mem.go:36] 
"Initialized new in-memory state store" Dec 16 12:12:11.787000 audit[2566]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2566 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.787000 audit[2566]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffc9af5150 a2=0 a3=0 items=0 ppid=2523 pid=2566 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.787000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F38 Dec 16 12:12:11.790806 kubelet[2523]: I1216 12:12:11.790751 2523 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Dec 16 12:12:11.791092 kubelet[2523]: I1216 12:12:11.791016 2523 policy_none.go:49] "None policy: Start" Dec 16 12:12:11.791092 kubelet[2523]: I1216 12:12:11.791034 2523 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 12:12:11.791092 kubelet[2523]: I1216 12:12:11.791045 2523 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:12:11.791000 audit[2567]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2567 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:11.791000 audit[2567]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=ffffc617f020 a2=0 a3=0 items=0 ppid=2523 pid=2567 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.791000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:12:11.794000 audit[2568]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2568 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.794000 audit[2568]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd10640f0 a2=0 a3=0 items=0 ppid=2523 pid=2568 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.794000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:12:11.795951 kubelet[2523]: I1216 12:12:11.795743 2523 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Dec 16 12:12:11.795951 kubelet[2523]: I1216 12:12:11.795770 2523 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 16 12:12:11.795951 kubelet[2523]: I1216 12:12:11.795792 2523 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Dec 16 12:12:11.795951 kubelet[2523]: I1216 12:12:11.795800 2523 kubelet.go:2436] "Starting kubelet main sync loop" Dec 16 12:12:11.795951 kubelet[2523]: E1216 12:12:11.795851 2523 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:12:11.799097 kubelet[2523]: E1216 12:12:11.799026 2523 reflector.go:200] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://10.0.21.106:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 10.0.21.106:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:12:11.799000 audit[2570]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2570 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:11.799000 audit[2570]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd4632d30 a2=0 a3=0 items=0 ppid=2523 pid=2570 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.799000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:12:11.799000 audit[2571]: NETFILTER_CFG table=nat:50 family=2 entries=1 op=nft_register_chain pid=2571 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.799000 audit[2571]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffd2aa650 a2=0 a3=0 items=0 ppid=2523 pid=2571 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.799000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:12:11.800000 audit[2573]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_chain pid=2573 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:11.800000 audit[2573]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcaf6b420 a2=0 a3=0 items=0 ppid=2523 pid=2573 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.800000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:12:11.800000 audit[2572]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2572 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:11.800000 audit[2572]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd39a20a0 a2=0 a3=0 items=0 ppid=2523 pid=2572 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.800000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:12:11.801000 audit[2574]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2574 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:11.801000 audit[2574]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff4478160 a2=0 a3=0 items=0 ppid=2523 pid=2574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:11.801000 audit: 
PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:12:11.812876 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 12:12:11.831420 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 12:12:11.834184 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:12:11.850121 kubelet[2523]: E1216 12:12:11.850064 2523 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" Dec 16 12:12:11.855090 kubelet[2523]: E1216 12:12:11.855009 2523 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:12:11.855215 kubelet[2523]: I1216 12:12:11.855193 2523 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:12:11.855256 kubelet[2523]: I1216 12:12:11.855211 2523 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:12:11.855738 kubelet[2523]: I1216 12:12:11.855720 2523 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:12:11.856466 kubelet[2523]: E1216 12:12:11.856445 2523 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:12:11.856533 kubelet[2523]: E1216 12:12:11.856484 2523 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4547-0-0-4-c6e23b3406\" not found" Dec 16 12:12:11.906749 systemd[1]: Created slice kubepods-burstable-podb96fec26af0eefb5af6b8a162b45a3e5.slice - libcontainer container kubepods-burstable-podb96fec26af0eefb5af6b8a162b45a3e5.slice. 
Dec 16 12:12:11.920147 kubelet[2523]: E1216 12:12:11.920083 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.923124 systemd[1]: Created slice kubepods-burstable-podab47c6a6003114c72e61300e646e4520.slice - libcontainer container kubepods-burstable-podab47c6a6003114c72e61300e646e4520.slice. Dec 16 12:12:11.926613 kubelet[2523]: E1216 12:12:11.926577 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.927821 systemd[1]: Created slice kubepods-burstable-pod089c865778d94423bfd6c7ecfbcac378.slice - libcontainer container kubepods-burstable-pod089c865778d94423bfd6c7ecfbcac378.slice. Dec 16 12:12:11.931507 kubelet[2523]: E1216 12:12:11.931482 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950790 kubelet[2523]: I1216 12:12:11.950766 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950844 kubelet[2523]: I1216 12:12:11.950801 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b96fec26af0eefb5af6b8a162b45a3e5-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-4-c6e23b3406\" (UID: \"b96fec26af0eefb5af6b8a162b45a3e5\") " pod="kube-system/kube-scheduler-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950844 
kubelet[2523]: I1216 12:12:11.950819 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ab47c6a6003114c72e61300e646e4520-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" (UID: \"ab47c6a6003114c72e61300e646e4520\") " pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950844 kubelet[2523]: I1216 12:12:11.950833 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ab47c6a6003114c72e61300e646e4520-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" (UID: \"ab47c6a6003114c72e61300e646e4520\") " pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950924 kubelet[2523]: I1216 12:12:11.950860 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950924 kubelet[2523]: I1216 12:12:11.950910 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.950969 kubelet[2523]: I1216 12:12:11.950955 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-kubeconfig\") pod 
\"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.951148 kubelet[2523]: I1216 12:12:11.951000 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.951148 kubelet[2523]: I1216 12:12:11.951055 2523 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ab47c6a6003114c72e61300e646e4520-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" (UID: \"ab47c6a6003114c72e61300e646e4520\") " pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.951148 kubelet[2523]: E1216 12:12:11.951099 2523 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-4-c6e23b3406?timeout=10s\": dial tcp 10.0.21.106:6443: connect: connection refused" interval="400ms" Dec 16 12:12:11.958108 kubelet[2523]: I1216 12:12:11.958048 2523 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:11.958420 kubelet[2523]: E1216 12:12:11.958383 2523 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.106:6443/api/v1/nodes\": dial tcp 10.0.21.106:6443: connect: connection refused" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:12.161247 kubelet[2523]: I1216 12:12:12.161148 2523 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:12.161752 kubelet[2523]: 
E1216 12:12:12.161725 2523 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://10.0.21.106:6443/api/v1/nodes\": dial tcp 10.0.21.106:6443: connect: connection refused" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:12.221865 containerd[1662]: time="2025-12-16T12:12:12.221815686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-4-c6e23b3406,Uid:b96fec26af0eefb5af6b8a162b45a3e5,Namespace:kube-system,Attempt:0,}" Dec 16 12:12:12.227795 containerd[1662]: time="2025-12-16T12:12:12.227763343Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-4-c6e23b3406,Uid:ab47c6a6003114c72e61300e646e4520,Namespace:kube-system,Attempt:0,}" Dec 16 12:12:12.232519 containerd[1662]: time="2025-12-16T12:12:12.232492756Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-4-c6e23b3406,Uid:089c865778d94423bfd6c7ecfbcac378,Namespace:kube-system,Attempt:0,}" Dec 16 12:12:12.250142 containerd[1662]: time="2025-12-16T12:12:12.250096765Z" level=info msg="connecting to shim d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc" address="unix:///run/containerd/s/44effb07867acd3796bb793fad9e9b9ed68813e6abb744acc4d234fc9e1337d7" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:12.273071 containerd[1662]: time="2025-12-16T12:12:12.273027349Z" level=info msg="connecting to shim 82eee5316149c61c23c824339add149c9aa0f7e34d26d8088e6fd5b8e2573101" address="unix:///run/containerd/s/02af43b339cbc5f77254b627fc69bc653adcd384841a954e77d589d14a775ddf" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:12.276898 systemd[1]: Started cri-containerd-d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc.scope - libcontainer container d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc. 
Dec 16 12:12:12.282964 containerd[1662]: time="2025-12-16T12:12:12.282895936Z" level=info msg="connecting to shim f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3" address="unix:///run/containerd/s/5d9cfee8c3fc0fabcf736b39c7b4a783ada94d56625febe33804a190c85a7405" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:12.293000 audit: BPF prog-id=83 op=LOAD Dec 16 12:12:12.294000 audit: BPF prog-id=84 op=LOAD Dec 16 12:12:12.294000 audit[2595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.294000 audit: BPF prog-id=84 op=UNLOAD Dec 16 12:12:12.294000 audit[2595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.294000 audit: BPF prog-id=85 op=LOAD Dec 16 12:12:12.294000 audit[2595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.295000 audit: BPF prog-id=86 op=LOAD Dec 16 12:12:12.295000 audit[2595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.295000 audit: BPF prog-id=86 op=UNLOAD Dec 16 12:12:12.295000 audit[2595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.295000 audit: BPF prog-id=85 op=UNLOAD Dec 16 12:12:12.295000 audit[2595]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.295000 audit: BPF prog-id=87 op=LOAD Dec 16 12:12:12.295000 audit[2595]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2583 pid=2595 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.295000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438333564393062353662353437633932366563643030633531653563 Dec 16 12:12:12.302808 systemd[1]: Started cri-containerd-82eee5316149c61c23c824339add149c9aa0f7e34d26d8088e6fd5b8e2573101.scope - libcontainer container 82eee5316149c61c23c824339add149c9aa0f7e34d26d8088e6fd5b8e2573101. Dec 16 12:12:12.306231 systemd[1]: Started cri-containerd-f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3.scope - libcontainer container f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3. 
Dec 16 12:12:12.316000 audit: BPF prog-id=88 op=LOAD Dec 16 12:12:12.317000 audit: BPF prog-id=89 op=LOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.317000 audit: BPF prog-id=89 op=UNLOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.317000 audit: BPF prog-id=90 op=LOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.317000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.317000 audit: BPF prog-id=91 op=LOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.317000 audit: BPF prog-id=91 op=UNLOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.317000 audit: BPF prog-id=90 op=UNLOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:12:12.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.317000 audit: BPF prog-id=92 op=LOAD Dec 16 12:12:12.317000 audit[2638]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2614 pid=2638 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.317000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3832656565353331363134396336316332336338323433333961646431 Dec 16 12:12:12.320000 audit: BPF prog-id=93 op=LOAD Dec 16 12:12:12.320000 audit: BPF prog-id=94 op=LOAD Dec 16 12:12:12.320000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.320000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.321000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:12:12.321000 audit[2663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.321000 audit: BPF prog-id=95 op=LOAD Dec 16 12:12:12.321000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.321000 audit: BPF prog-id=96 op=LOAD Dec 16 12:12:12.321000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.321000 audit: BPF prog-id=96 op=UNLOAD Dec 16 12:12:12.321000 audit[2663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.321000 audit: BPF prog-id=95 op=UNLOAD Dec 16 12:12:12.321000 audit[2663]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.321000 audit: BPF prog-id=97 op=LOAD Dec 16 12:12:12.321000 audit[2663]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2633 pid=2663 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.321000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633633934396435656232646530363736386132306433613166626465 Dec 16 12:12:12.328459 containerd[1662]: time="2025-12-16T12:12:12.328316823Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-scheduler-ci-4547-0-0-4-c6e23b3406,Uid:b96fec26af0eefb5af6b8a162b45a3e5,Namespace:kube-system,Attempt:0,} returns sandbox id \"d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc\"" Dec 16 12:12:12.335917 containerd[1662]: time="2025-12-16T12:12:12.335864764Z" level=info msg="CreateContainer within sandbox \"d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:12:12.345375 containerd[1662]: time="2025-12-16T12:12:12.345291630Z" level=info msg="Container d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:12.347156 containerd[1662]: time="2025-12-16T12:12:12.347120355Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4547-0-0-4-c6e23b3406,Uid:ab47c6a6003114c72e61300e646e4520,Namespace:kube-system,Attempt:0,} returns sandbox id \"82eee5316149c61c23c824339add149c9aa0f7e34d26d8088e6fd5b8e2573101\"" Dec 16 12:12:12.351367 containerd[1662]: time="2025-12-16T12:12:12.350902806Z" level=info msg="CreateContainer within sandbox \"82eee5316149c61c23c824339add149c9aa0f7e34d26d8088e6fd5b8e2573101\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:12:12.352595 kubelet[2523]: E1216 12:12:12.352555 2523 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://10.0.21.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-4-c6e23b3406?timeout=10s\": dial tcp 10.0.21.106:6443: connect: connection refused" interval="800ms" Dec 16 12:12:12.354013 containerd[1662]: time="2025-12-16T12:12:12.353931774Z" level=info msg="CreateContainer within sandbox \"d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86\"" Dec 16 
12:12:12.354674 containerd[1662]: time="2025-12-16T12:12:12.354647856Z" level=info msg="StartContainer for \"d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86\"" Dec 16 12:12:12.355795 containerd[1662]: time="2025-12-16T12:12:12.355765339Z" level=info msg="connecting to shim d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86" address="unix:///run/containerd/s/44effb07867acd3796bb793fad9e9b9ed68813e6abb744acc4d234fc9e1337d7" protocol=ttrpc version=3 Dec 16 12:12:12.361375 containerd[1662]: time="2025-12-16T12:12:12.361305515Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4547-0-0-4-c6e23b3406,Uid:089c865778d94423bfd6c7ecfbcac378,Namespace:kube-system,Attempt:0,} returns sandbox id \"f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3\"" Dec 16 12:12:12.364118 containerd[1662]: time="2025-12-16T12:12:12.364088163Z" level=info msg="Container f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:12.367148 containerd[1662]: time="2025-12-16T12:12:12.367048811Z" level=info msg="CreateContainer within sandbox \"f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:12:12.372720 containerd[1662]: time="2025-12-16T12:12:12.372675507Z" level=info msg="CreateContainer within sandbox \"82eee5316149c61c23c824339add149c9aa0f7e34d26d8088e6fd5b8e2573101\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e\"" Dec 16 12:12:12.373623 containerd[1662]: time="2025-12-16T12:12:12.373575669Z" level=info msg="StartContainer for \"f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e\"" Dec 16 12:12:12.375129 containerd[1662]: time="2025-12-16T12:12:12.375104673Z" level=info msg="connecting to shim 
f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e" address="unix:///run/containerd/s/02af43b339cbc5f77254b627fc69bc653adcd384841a954e77d589d14a775ddf" protocol=ttrpc version=3 Dec 16 12:12:12.377677 containerd[1662]: time="2025-12-16T12:12:12.377640600Z" level=info msg="Container 5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:12.379875 systemd[1]: Started cri-containerd-d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86.scope - libcontainer container d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86. Dec 16 12:12:12.390447 containerd[1662]: time="2025-12-16T12:12:12.390383156Z" level=info msg="CreateContainer within sandbox \"f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433\"" Dec 16 12:12:12.392059 containerd[1662]: time="2025-12-16T12:12:12.392022600Z" level=info msg="StartContainer for \"5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433\"" Dec 16 12:12:12.393830 containerd[1662]: time="2025-12-16T12:12:12.393802525Z" level=info msg="connecting to shim 5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433" address="unix:///run/containerd/s/5d9cfee8c3fc0fabcf736b39c7b4a783ada94d56625febe33804a190c85a7405" protocol=ttrpc version=3 Dec 16 12:12:12.397855 systemd[1]: Started cri-containerd-f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e.scope - libcontainer container f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e. 
Dec 16 12:12:12.399000 audit: BPF prog-id=98 op=LOAD Dec 16 12:12:12.400000 audit: BPF prog-id=99 op=LOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.400000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.400000 audit: BPF prog-id=100 op=LOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.400000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.400000 audit: BPF prog-id=101 op=LOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.400000 audit: BPF prog-id=101 op=UNLOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.400000 audit: BPF prog-id=100 op=UNLOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:12:12.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.400000 audit: BPF prog-id=102 op=LOAD Dec 16 12:12:12.400000 audit[2715]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2583 pid=2715 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.400000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6438666461373632383065393139353739333362356335333762343661 Dec 16 12:12:12.413809 systemd[1]: Started cri-containerd-5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433.scope - libcontainer container 5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433. 
Dec 16 12:12:12.416000 audit: BPF prog-id=103 op=LOAD Dec 16 12:12:12.417000 audit: BPF prog-id=104 op=LOAD Dec 16 12:12:12.417000 audit[2729]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.417000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.418000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:12:12.418000 audit[2729]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.418000 audit: BPF prog-id=105 op=LOAD Dec 16 12:12:12.418000 audit[2729]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.418000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.418000 audit: BPF prog-id=106 op=LOAD Dec 16 12:12:12.418000 audit[2729]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.419000 audit: BPF prog-id=106 op=UNLOAD Dec 16 12:12:12.419000 audit[2729]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.419000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.421000 audit: BPF prog-id=105 op=UNLOAD Dec 16 12:12:12.421000 audit[2729]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:12:12.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.421000 audit: BPF prog-id=107 op=LOAD Dec 16 12:12:12.421000 audit[2729]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2614 pid=2729 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.421000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6633353935663866356431633336393665316562343331656532336135 Dec 16 12:12:12.429000 audit: BPF prog-id=108 op=LOAD Dec 16 12:12:12.429000 audit: BPF prog-id=109 op=LOAD Dec 16 12:12:12.429000 audit[2747]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.429000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.430000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:12:12.430000 audit[2747]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.430000 audit: BPF prog-id=110 op=LOAD Dec 16 12:12:12.430000 audit[2747]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.430000 audit: BPF prog-id=111 op=LOAD Dec 16 12:12:12.430000 audit[2747]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.430000 audit: BPF prog-id=111 op=UNLOAD Dec 16 12:12:12.430000 audit[2747]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.430000 audit: BPF prog-id=110 op=UNLOAD Dec 16 12:12:12.430000 audit[2747]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.430000 audit: BPF prog-id=112 op=LOAD Dec 16 12:12:12.430000 audit[2747]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2633 pid=2747 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:12.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3532393465623465633936383637646630373538303438313634653362 Dec 16 12:12:12.437806 containerd[1662]: time="2025-12-16T12:12:12.437773048Z" level=info msg="StartContainer for \"d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86\" returns 
successfully" Dec 16 12:12:12.464400 containerd[1662]: time="2025-12-16T12:12:12.464091961Z" level=info msg="StartContainer for \"f3595f8f5d1c3696e1eb431ee23a527d2d0ffd02b26cd21b85ac85b917f7282e\" returns successfully" Dec 16 12:12:12.465654 containerd[1662]: time="2025-12-16T12:12:12.465609805Z" level=info msg="StartContainer for \"5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433\" returns successfully" Dec 16 12:12:12.563395 kubelet[2523]: I1216 12:12:12.563364 2523 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:12.806809 kubelet[2523]: E1216 12:12:12.806705 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:12.809424 kubelet[2523]: E1216 12:12:12.809395 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:12.811015 kubelet[2523]: E1216 12:12:12.810992 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:13.814624 kubelet[2523]: E1216 12:12:13.814582 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:13.815115 kubelet[2523]: E1216 12:12:13.815077 2523 kubelet.go:3305] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.193981 kubelet[2523]: E1216 12:12:14.193939 2523 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4547-0-0-4-c6e23b3406\" not 
found" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.371903 kubelet[2523]: I1216 12:12:14.371843 2523 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.449919 kubelet[2523]: I1216 12:12:14.449808 2523 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.455255 kubelet[2523]: E1216 12:12:14.455222 2523 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4547-0-0-4-c6e23b3406\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.455255 kubelet[2523]: I1216 12:12:14.455254 2523 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.456933 kubelet[2523]: E1216 12:12:14.456905 2523 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.456933 kubelet[2523]: I1216 12:12:14.456930 2523 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.458544 kubelet[2523]: E1216 12:12:14.458516 2523 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:14.726486 kubelet[2523]: I1216 12:12:14.726375 2523 apiserver.go:52] "Watching apiserver" Dec 16 12:12:14.749795 kubelet[2523]: I1216 12:12:14.749736 2523 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 12:12:16.511704 systemd[1]: Reload requested from client PID 2825 
('systemctl') (unit session-10.scope)... Dec 16 12:12:16.511721 systemd[1]: Reloading... Dec 16 12:12:16.575686 zram_generator::config[2871]: No configuration found. Dec 16 12:12:16.753149 systemd[1]: Reloading finished in 241 ms. Dec 16 12:12:16.781793 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:12:16.795869 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 12:12:16.796163 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:12:16.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:16.796236 systemd[1]: kubelet.service: Consumed 1.999s CPU time, 127.5M memory peak. Dec 16 12:12:16.796933 kernel: kauditd_printk_skb: 202 callbacks suppressed Dec 16 12:12:16.796997 kernel: audit: type=1131 audit(1765887136.795:394): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:16.798001 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 12:12:16.798000 audit: BPF prog-id=113 op=LOAD Dec 16 12:12:16.800128 kernel: audit: type=1334 audit(1765887136.798:395): prog-id=113 op=LOAD Dec 16 12:12:16.800180 kernel: audit: type=1334 audit(1765887136.798:396): prog-id=63 op=UNLOAD Dec 16 12:12:16.798000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:12:16.799000 audit: BPF prog-id=114 op=LOAD Dec 16 12:12:16.801690 kernel: audit: type=1334 audit(1765887136.799:397): prog-id=114 op=LOAD Dec 16 12:12:16.801726 kernel: audit: type=1334 audit(1765887136.800:398): prog-id=115 op=LOAD Dec 16 12:12:16.800000 audit: BPF prog-id=115 op=LOAD Dec 16 12:12:16.800000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:12:16.803235 kernel: audit: type=1334 audit(1765887136.800:399): prog-id=64 op=UNLOAD Dec 16 12:12:16.803265 kernel: audit: type=1334 audit(1765887136.800:400): prog-id=65 op=UNLOAD Dec 16 12:12:16.800000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:12:16.801000 audit: BPF prog-id=116 op=LOAD Dec 16 12:12:16.804797 kernel: audit: type=1334 audit(1765887136.801:401): prog-id=116 op=LOAD Dec 16 12:12:16.804832 kernel: audit: type=1334 audit(1765887136.801:402): prog-id=73 op=UNLOAD Dec 16 12:12:16.801000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:12:16.801000 audit: BPF prog-id=117 op=LOAD Dec 16 12:12:16.806279 kernel: audit: type=1334 audit(1765887136.801:403): prog-id=117 op=LOAD Dec 16 12:12:16.815000 audit: BPF prog-id=118 op=LOAD Dec 16 12:12:16.815000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:12:16.815000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:12:16.815000 audit: BPF prog-id=119 op=LOAD Dec 16 12:12:16.815000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:12:16.816000 audit: BPF prog-id=120 op=LOAD Dec 16 12:12:16.816000 audit: BPF prog-id=121 op=LOAD Dec 16 12:12:16.816000 audit: BPF prog-id=81 op=UNLOAD Dec 16 12:12:16.816000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:12:16.816000 audit: BPF prog-id=122 op=LOAD Dec 16 12:12:16.816000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:12:16.817000 audit: BPF prog-id=123 
op=LOAD Dec 16 12:12:16.817000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:12:16.817000 audit: BPF prog-id=124 op=LOAD Dec 16 12:12:16.817000 audit: BPF prog-id=125 op=LOAD Dec 16 12:12:16.817000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:12:16.817000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:12:16.817000 audit: BPF prog-id=126 op=LOAD Dec 16 12:12:16.817000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:12:16.819000 audit: BPF prog-id=127 op=LOAD Dec 16 12:12:16.819000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:12:16.820000 audit: BPF prog-id=128 op=LOAD Dec 16 12:12:16.820000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:12:16.820000 audit: BPF prog-id=129 op=LOAD Dec 16 12:12:16.820000 audit: BPF prog-id=130 op=LOAD Dec 16 12:12:16.820000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:12:16.820000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:12:16.821000 audit: BPF prog-id=131 op=LOAD Dec 16 12:12:16.821000 audit: BPF prog-id=132 op=LOAD Dec 16 12:12:16.821000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:12:16.821000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:12:16.967478 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:12:16.967000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:16.984220 (kubelet)[2916]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:12:17.016203 kubelet[2916]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:12:17.016203 kubelet[2916]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. 
Dec 16 12:12:17.016203 kubelet[2916]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:12:17.016530 kubelet[2916]: I1216 12:12:17.016238 2916 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:12:17.024660 kubelet[2916]: I1216 12:12:17.024001 2916 server.go:530] "Kubelet version" kubeletVersion="v1.33.0" Dec 16 12:12:17.024660 kubelet[2916]: I1216 12:12:17.024030 2916 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:12:17.024660 kubelet[2916]: I1216 12:12:17.024246 2916 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:12:17.025675 kubelet[2916]: I1216 12:12:17.025655 2916 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 12:12:17.028224 kubelet[2916]: I1216 12:12:17.028178 2916 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:12:17.033679 kubelet[2916]: I1216 12:12:17.033572 2916 server.go:1446] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:12:17.035961 kubelet[2916]: I1216 12:12:17.035936 2916 server.go:782] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
defaulting to /" Dec 16 12:12:17.036181 kubelet[2916]: I1216 12:12:17.036125 2916 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:12:17.036343 kubelet[2916]: I1216 12:12:17.036183 2916 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4547-0-0-4-c6e23b3406","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:12:17.036415 kubelet[2916]: I1216 12:12:17.036353 2916 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 
12:12:17.036415 kubelet[2916]: I1216 12:12:17.036362 2916 container_manager_linux.go:303] "Creating device plugin manager" Dec 16 12:12:17.036415 kubelet[2916]: I1216 12:12:17.036401 2916 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:12:17.036564 kubelet[2916]: I1216 12:12:17.036553 2916 kubelet.go:480] "Attempting to sync node with API server" Dec 16 12:12:17.036585 kubelet[2916]: I1216 12:12:17.036568 2916 kubelet.go:375] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:12:17.036609 kubelet[2916]: I1216 12:12:17.036594 2916 kubelet.go:386] "Adding apiserver pod source" Dec 16 12:12:17.036609 kubelet[2916]: I1216 12:12:17.036608 2916 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:12:17.040639 kubelet[2916]: I1216 12:12:17.037470 2916 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:12:17.040639 kubelet[2916]: I1216 12:12:17.038023 2916 kubelet.go:935] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:12:17.041770 kubelet[2916]: I1216 12:12:17.041749 2916 watchdog_linux.go:99] "Systemd watchdog is not enabled" Dec 16 12:12:17.041930 kubelet[2916]: I1216 12:12:17.041860 2916 server.go:1289] "Started kubelet" Dec 16 12:12:17.042810 kubelet[2916]: I1216 12:12:17.042782 2916 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:12:17.044646 kubelet[2916]: I1216 12:12:17.042950 2916 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:12:17.044646 kubelet[2916]: I1216 12:12:17.043221 2916 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:12:17.044646 kubelet[2916]: I1216 12:12:17.043616 2916 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 
12:12:17.046638 kubelet[2916]: I1216 12:12:17.045484 2916 server.go:317] "Adding debug handlers to kubelet server" Dec 16 12:12:17.048518 kubelet[2916]: E1216 12:12:17.048455 2916 kubelet.go:1600] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:12:17.048717 kubelet[2916]: I1216 12:12:17.048695 2916 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:12:17.053448 kubelet[2916]: E1216 12:12:17.053315 2916 kubelet_node_status.go:466] "Error getting the current node from lister" err="node \"ci-4547-0-0-4-c6e23b3406\" not found" Dec 16 12:12:17.053448 kubelet[2916]: I1216 12:12:17.053376 2916 volume_manager.go:297] "Starting Kubelet Volume Manager" Dec 16 12:12:17.054151 kubelet[2916]: I1216 12:12:17.054111 2916 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Dec 16 12:12:17.054342 kubelet[2916]: I1216 12:12:17.054316 2916 reconciler.go:26] "Reconciler: start to sync state" Dec 16 12:12:17.054756 kubelet[2916]: I1216 12:12:17.054718 2916 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:12:17.056650 kubelet[2916]: I1216 12:12:17.054877 2916 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:12:17.056650 kubelet[2916]: I1216 12:12:17.056175 2916 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:12:17.061243 kubelet[2916]: I1216 12:12:17.061203 2916 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Dec 16 12:12:17.063213 kubelet[2916]: I1216 12:12:17.063188 2916 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Dec 16 12:12:17.063213 kubelet[2916]: I1216 12:12:17.063212 2916 status_manager.go:230] "Starting to sync pod status with apiserver" Dec 16 12:12:17.063297 kubelet[2916]: I1216 12:12:17.063242 2916 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:12:17.063297 kubelet[2916]: I1216 12:12:17.063249 2916 kubelet.go:2436] "Starting kubelet main sync loop" Dec 16 12:12:17.063297 kubelet[2916]: E1216 12:12:17.063290 2916 kubelet.go:2460] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:12:17.100386 kubelet[2916]: I1216 12:12:17.100362 2916 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:12:17.100533 kubelet[2916]: I1216 12:12:17.100518 2916 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:12:17.100586 kubelet[2916]: I1216 12:12:17.100579 2916 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:12:17.100783 kubelet[2916]: I1216 12:12:17.100760 2916 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:12:17.100871 kubelet[2916]: I1216 12:12:17.100849 2916 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:12:17.100922 kubelet[2916]: I1216 12:12:17.100914 2916 policy_none.go:49] "None policy: Start" Dec 16 12:12:17.100986 kubelet[2916]: I1216 12:12:17.100976 2916 memory_manager.go:186] "Starting memorymanager" policy="None" Dec 16 12:12:17.101038 kubelet[2916]: I1216 12:12:17.101031 2916 state_mem.go:35] "Initializing new in-memory state store" Dec 16 12:12:17.101180 kubelet[2916]: I1216 12:12:17.101166 2916 state_mem.go:75] "Updated machine memory state" Dec 16 12:12:17.104808 kubelet[2916]: E1216 12:12:17.104786 2916 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:12:17.105030 kubelet[2916]: I1216 
12:12:17.105012 2916 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:12:17.105139 kubelet[2916]: I1216 12:12:17.105109 2916 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:12:17.105366 kubelet[2916]: I1216 12:12:17.105348 2916 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:12:17.106912 kubelet[2916]: E1216 12:12:17.106894 2916 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Dec 16 12:12:17.165084 kubelet[2916]: I1216 12:12:17.165037 2916 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.165312 kubelet[2916]: I1216 12:12:17.165077 2916 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.165398 kubelet[2916]: I1216 12:12:17.165154 2916 kubelet.go:3309] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.208001 kubelet[2916]: I1216 12:12:17.207972 2916 kubelet_node_status.go:75] "Attempting to register node" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.215658 kubelet[2916]: I1216 12:12:17.215492 2916 kubelet_node_status.go:124] "Node was previously registered" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.215658 kubelet[2916]: I1216 12:12:17.215559 2916 kubelet_node_status.go:78] "Successfully registered node" node="ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.255987 kubelet[2916]: I1216 12:12:17.255952 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/ab47c6a6003114c72e61300e646e4520-ca-certs\") pod \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" (UID: \"ab47c6a6003114c72e61300e646e4520\") " 
pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.255987 kubelet[2916]: I1216 12:12:17.255991 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/ab47c6a6003114c72e61300e646e4520-k8s-certs\") pod \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" (UID: \"ab47c6a6003114c72e61300e646e4520\") " pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256120 kubelet[2916]: I1216 12:12:17.256011 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-ca-certs\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256120 kubelet[2916]: I1216 12:12:17.256029 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-flexvolume-dir\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256120 kubelet[2916]: I1216 12:12:17.256082 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-k8s-certs\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256212 kubelet[2916]: I1216 12:12:17.256130 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: 
\"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-kubeconfig\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256212 kubelet[2916]: I1216 12:12:17.256148 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/089c865778d94423bfd6c7ecfbcac378-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4547-0-0-4-c6e23b3406\" (UID: \"089c865778d94423bfd6c7ecfbcac378\") " pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256212 kubelet[2916]: I1216 12:12:17.256174 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/b96fec26af0eefb5af6b8a162b45a3e5-kubeconfig\") pod \"kube-scheduler-ci-4547-0-0-4-c6e23b3406\" (UID: \"b96fec26af0eefb5af6b8a162b45a3e5\") " pod="kube-system/kube-scheduler-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:17.256212 kubelet[2916]: I1216 12:12:17.256192 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/ab47c6a6003114c72e61300e646e4520-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" (UID: \"ab47c6a6003114c72e61300e646e4520\") " pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:18.037204 kubelet[2916]: I1216 12:12:18.037164 2916 apiserver.go:52] "Watching apiserver" Dec 16 12:12:18.054250 kubelet[2916]: I1216 12:12:18.054213 2916 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Dec 16 12:12:18.088193 kubelet[2916]: I1216 12:12:18.088160 2916 kubelet.go:3309] "Creating a mirror pod for static pod" 
pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:18.095608 kubelet[2916]: E1216 12:12:18.095096 2916 kubelet.go:3311] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4547-0-0-4-c6e23b3406\" already exists" pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" Dec 16 12:12:18.115340 kubelet[2916]: I1216 12:12:18.115127 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4547-0-0-4-c6e23b3406" podStartSLOduration=1.115111377 podStartE2EDuration="1.115111377s" podCreationTimestamp="2025-12-16 12:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:12:18.106994714 +0000 UTC m=+1.119306637" watchObservedRunningTime="2025-12-16 12:12:18.115111377 +0000 UTC m=+1.127423300" Dec 16 12:12:18.124124 kubelet[2916]: I1216 12:12:18.124072 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" podStartSLOduration=1.124056282 podStartE2EDuration="1.124056282s" podCreationTimestamp="2025-12-16 12:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:12:18.115911259 +0000 UTC m=+1.128223222" watchObservedRunningTime="2025-12-16 12:12:18.124056282 +0000 UTC m=+1.136368205" Dec 16 12:12:18.133675 kubelet[2916]: I1216 12:12:18.133422 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4547-0-0-4-c6e23b3406" podStartSLOduration=1.133407748 podStartE2EDuration="1.133407748s" podCreationTimestamp="2025-12-16 12:12:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:12:18.124343203 +0000 UTC m=+1.136655126" watchObservedRunningTime="2025-12-16 
12:12:18.133407748 +0000 UTC m=+1.145719671" Dec 16 12:12:23.599209 kubelet[2916]: I1216 12:12:23.599169 2916 kuberuntime_manager.go:1746] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:12:23.599533 containerd[1662]: time="2025-12-16T12:12:23.599495526Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:12:23.599745 kubelet[2916]: I1216 12:12:23.599715 2916 kubelet_network.go:61] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:12:24.772710 systemd[1]: Created slice kubepods-besteffort-pod6a27f997_887a_4a8e_9cd0_07d7a9b931a0.slice - libcontainer container kubepods-besteffort-pod6a27f997_887a_4a8e_9cd0_07d7a9b931a0.slice. Dec 16 12:12:24.808059 kubelet[2916]: I1216 12:12:24.808027 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/6a27f997-887a-4a8e-9cd0-07d7a9b931a0-xtables-lock\") pod \"kube-proxy-qsw6m\" (UID: \"6a27f997-887a-4a8e-9cd0-07d7a9b931a0\") " pod="kube-system/kube-proxy-qsw6m" Dec 16 12:12:24.808648 kubelet[2916]: I1216 12:12:24.808522 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6a27f997-887a-4a8e-9cd0-07d7a9b931a0-lib-modules\") pod \"kube-proxy-qsw6m\" (UID: \"6a27f997-887a-4a8e-9cd0-07d7a9b931a0\") " pod="kube-system/kube-proxy-qsw6m" Dec 16 12:12:24.808648 kubelet[2916]: I1216 12:12:24.808555 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dqwh5\" (UniqueName: \"kubernetes.io/projected/6a27f997-887a-4a8e-9cd0-07d7a9b931a0-kube-api-access-dqwh5\") pod \"kube-proxy-qsw6m\" (UID: \"6a27f997-887a-4a8e-9cd0-07d7a9b931a0\") " pod="kube-system/kube-proxy-qsw6m" Dec 16 12:12:24.808648 kubelet[2916]: I1216 12:12:24.808578 2916 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/6a27f997-887a-4a8e-9cd0-07d7a9b931a0-kube-proxy\") pod \"kube-proxy-qsw6m\" (UID: \"6a27f997-887a-4a8e-9cd0-07d7a9b931a0\") " pod="kube-system/kube-proxy-qsw6m" Dec 16 12:12:24.893548 systemd[1]: Created slice kubepods-besteffort-pod561879bb_4f13_4ab8_8d45_be9ae9ec7afc.slice - libcontainer container kubepods-besteffort-pod561879bb_4f13_4ab8_8d45_be9ae9ec7afc.slice. Dec 16 12:12:24.909355 kubelet[2916]: I1216 12:12:24.909308 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/561879bb-4f13-4ab8-8d45-be9ae9ec7afc-var-lib-calico\") pod \"tigera-operator-7dcd859c48-m687x\" (UID: \"561879bb-4f13-4ab8-8d45-be9ae9ec7afc\") " pod="tigera-operator/tigera-operator-7dcd859c48-m687x" Dec 16 12:12:24.909513 kubelet[2916]: I1216 12:12:24.909384 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4ltf\" (UniqueName: \"kubernetes.io/projected/561879bb-4f13-4ab8-8d45-be9ae9ec7afc-kube-api-access-l4ltf\") pod \"tigera-operator-7dcd859c48-m687x\" (UID: \"561879bb-4f13-4ab8-8d45-be9ae9ec7afc\") " pod="tigera-operator/tigera-operator-7dcd859c48-m687x" Dec 16 12:12:25.089944 containerd[1662]: time="2025-12-16T12:12:25.089848197Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qsw6m,Uid:6a27f997-887a-4a8e-9cd0-07d7a9b931a0,Namespace:kube-system,Attempt:0,}" Dec 16 12:12:25.108222 containerd[1662]: time="2025-12-16T12:12:25.108181328Z" level=info msg="connecting to shim 9901cb4e85e63a4d9a5a6a0bf792269060fdfe96202bbbe936035459fff77142" address="unix:///run/containerd/s/27bc3c3d744bc22735ade816082090c27d56916d0eedfbb14e463374662028b9" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:25.140908 systemd[1]: Started 
cri-containerd-9901cb4e85e63a4d9a5a6a0bf792269060fdfe96202bbbe936035459fff77142.scope - libcontainer container 9901cb4e85e63a4d9a5a6a0bf792269060fdfe96202bbbe936035459fff77142. Dec 16 12:12:25.149000 audit: BPF prog-id=133 op=LOAD Dec 16 12:12:25.151300 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:12:25.151344 kernel: audit: type=1334 audit(1765887145.149:436): prog-id=133 op=LOAD Dec 16 12:12:25.150000 audit: BPF prog-id=134 op=LOAD Dec 16 12:12:25.152918 kernel: audit: type=1334 audit(1765887145.150:437): prog-id=134 op=LOAD Dec 16 12:12:25.152949 kernel: audit: type=1300 audit(1765887145.150:437): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.150000 audit[2995]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.156333 kernel: audit: type=1327 audit(1765887145.150:437): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.150000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.151000 audit: BPF prog-id=134 op=UNLOAD Dec 16 12:12:25.160397 kernel: audit: type=1334 
audit(1765887145.151:438): prog-id=134 op=UNLOAD Dec 16 12:12:25.160445 kernel: audit: type=1300 audit(1765887145.151:438): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.151000 audit[2995]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.167079 kernel: audit: type=1327 audit(1765887145.151:438): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.167184 kernel: audit: type=1334 audit(1765887145.151:439): prog-id=135 op=LOAD Dec 16 12:12:25.151000 audit: BPF prog-id=135 op=LOAD Dec 16 12:12:25.167910 kernel: audit: type=1300 audit(1765887145.151:439): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.151000 audit[2995]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2984 
pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.151000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.174453 kernel: audit: type=1327 audit(1765887145.151:439): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.152000 audit: BPF prog-id=136 op=LOAD Dec 16 12:12:25.152000 audit[2995]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.152000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.155000 audit: BPF prog-id=136 op=UNLOAD Dec 16 12:12:25.155000 audit[2995]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.155000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.155000 audit: BPF prog-id=135 op=UNLOAD Dec 16 12:12:25.155000 audit[2995]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.155000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.155000 audit: BPF prog-id=137 op=LOAD Dec 16 12:12:25.155000 audit[2995]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2984 pid=2995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.155000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939303163623465383565363361346439613561366130626637393232 Dec 16 12:12:25.185947 containerd[1662]: time="2025-12-16T12:12:25.185910744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-qsw6m,Uid:6a27f997-887a-4a8e-9cd0-07d7a9b931a0,Namespace:kube-system,Attempt:0,} returns sandbox id \"9901cb4e85e63a4d9a5a6a0bf792269060fdfe96202bbbe936035459fff77142\"" Dec 16 12:12:25.191863 containerd[1662]: 
time="2025-12-16T12:12:25.191723721Z" level=info msg="CreateContainer within sandbox \"9901cb4e85e63a4d9a5a6a0bf792269060fdfe96202bbbe936035459fff77142\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:12:25.196789 containerd[1662]: time="2025-12-16T12:12:25.196762495Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-m687x,Uid:561879bb-4f13-4ab8-8d45-be9ae9ec7afc,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:12:25.202152 containerd[1662]: time="2025-12-16T12:12:25.202115510Z" level=info msg="Container 501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:25.210936 containerd[1662]: time="2025-12-16T12:12:25.210901694Z" level=info msg="CreateContainer within sandbox \"9901cb4e85e63a4d9a5a6a0bf792269060fdfe96202bbbe936035459fff77142\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407\"" Dec 16 12:12:25.211465 containerd[1662]: time="2025-12-16T12:12:25.211437616Z" level=info msg="StartContainer for \"501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407\"" Dec 16 12:12:25.213256 containerd[1662]: time="2025-12-16T12:12:25.213220900Z" level=info msg="connecting to shim 501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407" address="unix:///run/containerd/s/27bc3c3d744bc22735ade816082090c27d56916d0eedfbb14e463374662028b9" protocol=ttrpc version=3 Dec 16 12:12:25.221191 containerd[1662]: time="2025-12-16T12:12:25.221106082Z" level=info msg="connecting to shim a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6" address="unix:///run/containerd/s/976dd9f4f6eac049d9ad1dfc2bfa02cad45e04cc04c56b7aa554dfda47ae85ac" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:25.231835 systemd[1]: Started cri-containerd-501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407.scope - libcontainer container 
501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407. Dec 16 12:12:25.248810 systemd[1]: Started cri-containerd-a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6.scope - libcontainer container a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6. Dec 16 12:12:25.257000 audit: BPF prog-id=138 op=LOAD Dec 16 12:12:25.258000 audit: BPF prog-id=139 op=LOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.258000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.258000 audit: BPF prog-id=140 op=LOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.258000 audit: BPF prog-id=141 op=LOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.258000 audit: BPF prog-id=141 op=UNLOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.258000 audit: BPF prog-id=140 op=UNLOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.258000 audit: BPF prog-id=142 op=LOAD Dec 16 12:12:25.258000 audit[3053]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3036 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.258000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131623533363233663730306364636233613363326538393165616537 Dec 16 12:12:25.275000 audit: BPF prog-id=143 op=LOAD Dec 16 12:12:25.275000 audit[3022]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2984 pid=3022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530316231626437366638393532633766333738353866363066666638 Dec 16 12:12:25.275000 audit: BPF prog-id=144 op=LOAD Dec 16 12:12:25.275000 audit[3022]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2984 pid=3022 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530316231626437366638393532633766333738353866363066666638 Dec 16 12:12:25.275000 audit: BPF prog-id=144 op=UNLOAD Dec 16 12:12:25.275000 audit[3022]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2984 pid=3022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530316231626437366638393532633766333738353866363066666638 Dec 16 12:12:25.275000 audit: BPF prog-id=143 op=UNLOAD Dec 16 12:12:25.275000 audit[3022]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2984 pid=3022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530316231626437366638393532633766333738353866363066666638 Dec 16 12:12:25.275000 audit: BPF prog-id=145 op=LOAD Dec 16 12:12:25.275000 audit[3022]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 
a2=98 a3=0 items=0 ppid=2984 pid=3022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3530316231626437366638393532633766333738353866363066666638 Dec 16 12:12:25.284664 containerd[1662]: time="2025-12-16T12:12:25.284609419Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-7dcd859c48-m687x,Uid:561879bb-4f13-4ab8-8d45-be9ae9ec7afc,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6\"" Dec 16 12:12:25.285955 containerd[1662]: time="2025-12-16T12:12:25.285917663Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:12:25.297410 containerd[1662]: time="2025-12-16T12:12:25.297310855Z" level=info msg="StartContainer for \"501b1bd76f8952c7f37858f60fff8b6dc3bec8e70661121a28874eeacd016407\" returns successfully" Dec 16 12:12:25.445000 audit[3132]: NETFILTER_CFG table=mangle:54 family=10 entries=1 op=nft_register_chain pid=3132 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.445000 audit[3132]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd117b870 a2=0 a3=1 items=0 ppid=3060 pid=3132 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.445000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:12:25.448000 audit[3136]: NETFILTER_CFG table=nat:55 family=10 entries=1 op=nft_register_chain pid=3136 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.448000 audit[3136]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffefb9a680 a2=0 a3=1 items=0 ppid=3060 pid=3136 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.448000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:12:25.448000 audit[3133]: NETFILTER_CFG table=mangle:56 family=2 entries=1 op=nft_register_chain pid=3133 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.448000 audit[3133]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff4671f70 a2=0 a3=1 items=0 ppid=3060 pid=3133 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.448000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:12:25.450000 audit[3138]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.450000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd06a81c0 a2=0 a3=1 items=0 ppid=3060 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.450000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:12:25.451000 audit[3140]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=3140 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.451000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff00d6bd0 a2=0 a3=1 items=0 ppid=3060 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.451000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:12:25.454000 audit[3141]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3141 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.454000 audit[3141]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc9624180 a2=0 a3=1 items=0 ppid=3060 pid=3141 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.454000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:12:25.550000 audit[3144]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3144 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.550000 audit[3144]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe552b310 a2=0 a3=1 items=0 ppid=3060 pid=3144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.550000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:12:25.553000 audit[3146]: NETFILTER_CFG table=filter:61 family=2 entries=1 
op=nft_register_rule pid=3146 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.553000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd71fcdd0 a2=0 a3=1 items=0 ppid=3060 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.553000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276696365 Dec 16 12:12:25.558000 audit[3149]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.558000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffdd993580 a2=0 a3=1 items=0 ppid=3060 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.558000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669 Dec 16 12:12:25.559000 audit[3150]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3150 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.559000 audit[3150]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdf4d0670 a2=0 a3=1 items=0 ppid=3060 pid=3150 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.559000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:12:25.563000 audit[3152]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3152 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.563000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd0475ce0 a2=0 a3=1 items=0 ppid=3060 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.563000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:12:25.564000 audit[3153]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3153 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.564000 audit[3153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd9494830 a2=0 a3=1 items=0 ppid=3060 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.564000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:12:25.567000 audit[3155]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3155 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.567000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffe325b030 a2=0 a3=1 items=0 ppid=3060 
pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.567000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Dec 16 12:12:25.571000 audit[3158]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.571000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd39e1670 a2=0 a3=1 items=0 ppid=3060 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.571000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D53 Dec 16 12:12:25.572000 audit[3159]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3159 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.572000 audit[3159]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc3c91c10 a2=0 a3=1 items=0 ppid=3060 pid=3159 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.572000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 
12:12:25.574000 audit[3161]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.574000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffebc29fe0 a2=0 a3=1 items=0 ppid=3060 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.574000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:12:25.575000 audit[3162]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3162 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.575000 audit[3162]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffef2a0ec0 a2=0 a3=1 items=0 ppid=3060 pid=3162 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.575000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:12:25.578000 audit[3164]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3164 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.578000 audit[3164]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdd627f90 a2=0 a3=1 items=0 ppid=3060 pid=3164 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.578000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:12:25.581000 audit[3167]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3167 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.581000 audit[3167]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcd56bcd0 a2=0 a3=1 items=0 ppid=3060 pid=3167 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.581000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:12:25.585000 audit[3170]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3170 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.585000 audit[3170]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdf8ca250 a2=0 a3=1 items=0 ppid=3060 pid=3170 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.585000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Dec 16 12:12:25.586000 audit[3171]: NETFILTER_CFG table=nat:74 family=2 entries=1 
op=nft_register_chain pid=3171 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.586000 audit[3171]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffce53d170 a2=0 a3=1 items=0 ppid=3060 pid=3171 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.586000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:12:25.588000 audit[3173]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3173 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.588000 audit[3173]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffe553b800 a2=0 a3=1 items=0 ppid=3060 pid=3173 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.588000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:12:25.592000 audit[3176]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3176 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.592000 audit[3176]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffff60e3360 a2=0 a3=1 items=0 ppid=3060 pid=3176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.592000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:12:25.593000 audit[3177]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3177 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.593000 audit[3177]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff51edc10 a2=0 a3=1 items=0 ppid=3060 pid=3177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.593000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:12:25.595000 audit[3179]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3179 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:12:25.595000 audit[3179]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=fffff7283330 a2=0 a3=1 items=0 ppid=3060 pid=3179 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.595000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:12:25.619000 audit[3185]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3185 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:25.619000 audit[3185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff01637e0 a2=0 a3=1 items=0 ppid=3060 pid=3185 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.619000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:25.638000 audit[3185]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3185 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:25.638000 audit[3185]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=fffff01637e0 a2=0 a3=1 items=0 ppid=3060 pid=3185 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.638000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:25.640000 audit[3190]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3190 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.640000 audit[3190]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffffcfef4b0 a2=0 a3=1 items=0 ppid=3060 pid=3190 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.640000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:12:25.643000 audit[3192]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3192 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.643000 audit[3192]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=fffffed89590 
a2=0 a3=1 items=0 ppid=3060 pid=3192 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.643000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C6520736572766963 Dec 16 12:12:25.647000 audit[3195]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3195 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.647000 audit[3195]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffdbc2b4e0 a2=0 a3=1 items=0 ppid=3060 pid=3195 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.647000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C652073657276 Dec 16 12:12:25.648000 audit[3196]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3196 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.648000 audit[3196]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff1f82970 a2=0 a3=1 items=0 ppid=3060 pid=3196 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.648000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:12:25.650000 audit[3198]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3198 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.650000 audit[3198]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffc5bcb580 a2=0 a3=1 items=0 ppid=3060 pid=3198 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.650000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:12:25.651000 audit[3199]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3199 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.651000 audit[3199]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffec356a50 a2=0 a3=1 items=0 ppid=3060 pid=3199 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.651000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:12:25.654000 audit[3201]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3201 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.654000 audit[3201]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=fffffe082fc0 a2=0 a3=1 items=0 ppid=3060 pid=3201 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.654000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B554245 Dec 16 12:12:25.657000 audit[3204]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3204 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.657000 audit[3204]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffc6ef2f80 a2=0 a3=1 items=0 ppid=3060 pid=3204 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.657000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D Dec 16 12:12:25.659000 audit[3205]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3205 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.659000 audit[3205]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc86bc5b0 a2=0 a3=1 items=0 ppid=3060 pid=3205 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.659000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:12:25.661000 audit[3207]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule 
pid=3207 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.661000 audit[3207]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe3a03ad0 a2=0 a3=1 items=0 ppid=3060 pid=3207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.661000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:12:25.662000 audit[3208]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3208 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.662000 audit[3208]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd3e49880 a2=0 a3=1 items=0 ppid=3060 pid=3208 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.662000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:12:25.665000 audit[3210]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3210 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.665000 audit[3210]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffff50c410 a2=0 a3=1 items=0 ppid=3060 pid=3210 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.665000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A Dec 16 12:12:25.668000 audit[3213]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3213 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.668000 audit[3213]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffbf11da0 a2=0 a3=1 items=0 ppid=3060 pid=3213 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.668000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D Dec 16 12:12:25.671000 audit[3216]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3216 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.671000 audit[3216]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd6a6ed40 a2=0 a3=1 items=0 ppid=3060 pid=3216 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.671000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C Dec 16 12:12:25.672000 audit[3217]: NETFILTER_CFG table=nat:95 family=10 entries=1 
op=nft_register_chain pid=3217 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.672000 audit[3217]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc1a39340 a2=0 a3=1 items=0 ppid=3060 pid=3217 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.672000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:12:25.675000 audit[3219]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3219 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.675000 audit[3219]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffff7b20a0 a2=0 a3=1 items=0 ppid=3060 pid=3219 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.675000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:12:25.678000 audit[3222]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.678000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe0ba7e20 a2=0 a3=1 items=0 ppid=3060 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.678000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D5700313030303030002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:12:25.679000 audit[3223]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3223 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.679000 audit[3223]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffd5e3050 a2=0 a3=1 items=0 ppid=3060 pid=3223 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.679000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:12:25.681000 audit[3225]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3225 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.681000 audit[3225]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffcdac46c0 a2=0 a3=1 items=0 ppid=3060 pid=3225 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.681000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:12:25.682000 audit[3226]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3226 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.682000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe0059ee0 a2=0 a3=1 items=0 ppid=3060 
pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.682000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:12:25.685000 audit[3228]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3228 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.685000 audit[3228]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffed77d390 a2=0 a3=1 items=0 ppid=3060 pid=3228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.685000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:12:25.689000 audit[3231]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3231 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:12:25.689000 audit[3231]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffeea5e390 a2=0 a3=1 items=0 ppid=3060 pid=3231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.689000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D5700313030303030002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:12:25.692000 audit[3233]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3233 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:12:25.692000 audit[3233]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 
a1=ffffe6152b80 a2=0 a3=1 items=0 ppid=3060 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.692000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:25.693000 audit[3233]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3233 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:12:25.693000 audit[3233]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffe6152b80 a2=0 a3=1 items=0 ppid=3060 pid=3233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:25.693000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:26.120837 kubelet[2916]: I1216 12:12:26.120779 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-qsw6m" podStartSLOduration=2.120762788 podStartE2EDuration="2.120762788s" podCreationTimestamp="2025-12-16 12:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:12:26.120703468 +0000 UTC m=+9.133015391" watchObservedRunningTime="2025-12-16 12:12:26.120762788 +0000 UTC m=+9.133074711" Dec 16 12:12:27.180744 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1392636227.mount: Deactivated successfully. 
Dec 16 12:12:28.057036 containerd[1662]: time="2025-12-16T12:12:28.056983020Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:28.058174 containerd[1662]: time="2025-12-16T12:12:28.057975583Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Dec 16 12:12:28.059669 containerd[1662]: time="2025-12-16T12:12:28.059641587Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:28.062972 containerd[1662]: time="2025-12-16T12:12:28.062946917Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:28.063894 containerd[1662]: time="2025-12-16T12:12:28.063864359Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.777900696s" Dec 16 12:12:28.063953 containerd[1662]: time="2025-12-16T12:12:28.063895079Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 16 12:12:28.072751 containerd[1662]: time="2025-12-16T12:12:28.072720304Z" level=info msg="CreateContainer within sandbox \"a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:12:28.080809 containerd[1662]: time="2025-12-16T12:12:28.080761046Z" level=info msg="Container 
56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:28.091490 containerd[1662]: time="2025-12-16T12:12:28.091377676Z" level=info msg="CreateContainer within sandbox \"a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\"" Dec 16 12:12:28.092622 containerd[1662]: time="2025-12-16T12:12:28.091973237Z" level=info msg="StartContainer for \"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\"" Dec 16 12:12:28.092950 containerd[1662]: time="2025-12-16T12:12:28.092926880Z" level=info msg="connecting to shim 56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235" address="unix:///run/containerd/s/976dd9f4f6eac049d9ad1dfc2bfa02cad45e04cc04c56b7aa554dfda47ae85ac" protocol=ttrpc version=3 Dec 16 12:12:28.116347 systemd[1]: Started cri-containerd-56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235.scope - libcontainer container 56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235. 
Dec 16 12:12:28.131000 audit: BPF prog-id=146 op=LOAD Dec 16 12:12:28.131000 audit: BPF prog-id=147 op=LOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:28.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.131000 audit: BPF prog-id=147 op=UNLOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:28.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.131000 audit: BPF prog-id=148 op=LOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:28.131000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.131000 audit: BPF prog-id=149 op=LOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:28.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.131000 audit: BPF prog-id=149 op=UNLOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:28.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.131000 audit: BPF prog-id=148 op=UNLOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:12:28.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.131000 audit: BPF prog-id=150 op=LOAD Dec 16 12:12:28.131000 audit[3242]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3036 pid=3242 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:28.131000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3536343938616563373639356664646338643062623064343830343834 Dec 16 12:12:28.148198 containerd[1662]: time="2025-12-16T12:12:28.147982153Z" level=info msg="StartContainer for \"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\" returns successfully" Dec 16 12:12:29.126033 kubelet[2916]: I1216 12:12:29.125970 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-7dcd859c48-m687x" podStartSLOduration=2.346967778 podStartE2EDuration="5.125953077s" podCreationTimestamp="2025-12-16 12:12:24 +0000 UTC" firstStartedPulling="2025-12-16 12:12:25.285560542 +0000 UTC m=+8.297872465" lastFinishedPulling="2025-12-16 12:12:28.064545841 +0000 UTC m=+11.076857764" observedRunningTime="2025-12-16 12:12:29.125845397 +0000 UTC m=+12.138157320" watchObservedRunningTime="2025-12-16 12:12:29.125953077 +0000 UTC m=+12.138265000" Dec 16 12:12:33.346846 sudo[1961]: pam_unix(sudo:session): session closed for user root Dec 16 12:12:33.346000 audit[1961]: USER_END pid=1961 uid=500 auid=500 ses=10 
subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:12:33.350372 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 12:12:33.350409 kernel: audit: type=1106 audit(1765887153.346:516): pid=1961 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:12:33.346000 audit[1961]: CRED_DISP pid=1961 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:12:33.354232 kernel: audit: type=1104 audit(1765887153.346:517): pid=1961 uid=500 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:12:33.517535 sshd[1960]: Connection closed by 139.178.68.195 port 36766 Dec 16 12:12:33.517376 sshd-session[1956]: pam_unix(sshd:session): session closed for user core Dec 16 12:12:33.519000 audit[1956]: USER_END pid=1956 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:12:33.522910 systemd[1]: sshd@8-10.0.21.106:22-139.178.68.195:36766.service: Deactivated successfully. 
Dec 16 12:12:33.528809 kernel: audit: type=1106 audit(1765887153.519:518): pid=1956 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:12:33.528859 kernel: audit: type=1104 audit(1765887153.519:519): pid=1956 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:12:33.519000 audit[1956]: CRED_DISP pid=1956 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:12:33.525000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.106:22-139.178.68.195:36766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:33.531824 kernel: audit: type=1131 audit(1765887153.525:520): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-10.0.21.106:22-139.178.68.195:36766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:12:33.529974 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:12:33.530203 systemd[1]: session-10.scope: Consumed 6.944s CPU time, 221.2M memory peak. Dec 16 12:12:33.533137 systemd-logind[1644]: Session 10 logged out. Waiting for processes to exit. Dec 16 12:12:33.534475 systemd-logind[1644]: Removed session 10. 
Dec 16 12:12:34.832000 audit[3330]: NETFILTER_CFG table=filter:105 family=2 entries=14 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:34.832000 audit[3330]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff21e1ca0 a2=0 a3=1 items=0 ppid=3060 pid=3330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:34.839636 kernel: audit: type=1325 audit(1765887154.832:521): table=filter:105 family=2 entries=14 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:34.839697 kernel: audit: type=1300 audit(1765887154.832:521): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff21e1ca0 a2=0 a3=1 items=0 ppid=3060 pid=3330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:34.832000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:34.842000 audit[3330]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:34.845516 kernel: audit: type=1327 audit(1765887154.832:521): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:34.845569 kernel: audit: type=1325 audit(1765887154.842:522): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3330 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:34.842000 audit[3330]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff21e1ca0 a2=0 a3=1 items=0 ppid=3060 pid=3330 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:34.850179 kernel: audit: type=1300 audit(1765887154.842:522): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff21e1ca0 a2=0 a3=1 items=0 ppid=3060 pid=3330 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:34.842000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:34.855000 audit[3332]: NETFILTER_CFG table=filter:107 family=2 entries=15 op=nft_register_rule pid=3332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:34.855000 audit[3332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffe6a384a0 a2=0 a3=1 items=0 ppid=3060 pid=3332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:34.855000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:34.861000 audit[3332]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3332 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:34.861000 audit[3332]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe6a384a0 a2=0 a3=1 items=0 ppid=3060 pid=3332 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:34.861000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.294000 audit[3335]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.297775 kernel: kauditd_printk_skb: 7 callbacks suppressed Dec 16 12:12:39.297880 kernel: audit: type=1325 audit(1765887159.294:525): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.294000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffd22067f0 a2=0 a3=1 items=0 ppid=3060 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.294000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.304646 kernel: audit: type=1300 audit(1765887159.294:525): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffd22067f0 a2=0 a3=1 items=0 ppid=3060 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.304719 kernel: audit: type=1327 audit(1765887159.294:525): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.307000 audit[3335]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.307000 audit[3335]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd22067f0 a2=0 a3=1 items=0 ppid=3060 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.313947 kernel: audit: type=1325 audit(1765887159.307:526): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3335 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.314009 kernel: audit: type=1300 audit(1765887159.307:526): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd22067f0 a2=0 a3=1 items=0 ppid=3060 pid=3335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.307000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.315792 kernel: audit: type=1327 audit(1765887159.307:526): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.328000 audit[3337]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3337 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.328000 audit[3337]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffc4ed9c20 a2=0 a3=1 items=0 ppid=3060 pid=3337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.335637 kernel: audit: type=1325 audit(1765887159.328:527): table=filter:111 family=2 entries=18 op=nft_register_rule pid=3337 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.335690 kernel: audit: type=1300 audit(1765887159.328:527): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffc4ed9c20 a2=0 a3=1 items=0 ppid=3060 pid=3337 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.328000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.337587 kernel: audit: type=1327 audit(1765887159.328:527): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:39.336000 audit[3337]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3337 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.339499 kernel: audit: type=1325 audit(1765887159.336:528): table=nat:112 family=2 entries=12 op=nft_register_rule pid=3337 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:39.336000 audit[3337]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc4ed9c20 a2=0 a3=1 items=0 ppid=3060 pid=3337 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:39.336000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:40.709000 audit[3339]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3339 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:40.709000 audit[3339]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffec9a56d0 a2=0 a3=1 items=0 ppid=3060 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:40.709000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:40.715000 audit[3339]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3339 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:40.715000 audit[3339]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffec9a56d0 a2=0 a3=1 items=0 ppid=3060 pid=3339 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:40.715000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:41.725000 audit[3341]: NETFILTER_CFG table=filter:115 family=2 entries=20 op=nft_register_rule pid=3341 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:41.725000 audit[3341]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffff68420f0 a2=0 a3=1 items=0 ppid=3060 pid=3341 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:41.725000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:41.735000 audit[3341]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3341 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:41.735000 audit[3341]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff68420f0 a2=0 a3=1 items=0 ppid=3060 pid=3341 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) 
Dec 16 12:12:41.735000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:43.771000 audit[3344]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3344 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:43.771000 audit[3344]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffd4bdd440 a2=0 a3=1 items=0 ppid=3060 pid=3344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:43.771000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:43.785000 audit[3344]: NETFILTER_CFG table=nat:118 family=2 entries=12 op=nft_register_rule pid=3344 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:43.785000 audit[3344]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd4bdd440 a2=0 a3=1 items=0 ppid=3060 pid=3344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:43.785000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:43.800160 systemd[1]: Created slice kubepods-besteffort-pod26945041_dd74_4312_b830_9aab04107d05.slice - libcontainer container kubepods-besteffort-pod26945041_dd74_4312_b830_9aab04107d05.slice. 
Dec 16 12:12:43.805000 audit[3346]: NETFILTER_CFG table=filter:119 family=2 entries=22 op=nft_register_rule pid=3346 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:43.805000 audit[3346]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe488e4e0 a2=0 a3=1 items=0 ppid=3060 pid=3346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:43.805000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:43.824000 audit[3346]: NETFILTER_CFG table=nat:120 family=2 entries=12 op=nft_register_rule pid=3346 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:43.824000 audit[3346]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe488e4e0 a2=0 a3=1 items=0 ppid=3060 pid=3346 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:43.824000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:43.830648 kubelet[2916]: I1216 12:12:43.830586 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/26945041-dd74-4312-b830-9aab04107d05-typha-certs\") pod \"calico-typha-676bc459d7-rd9m7\" (UID: \"26945041-dd74-4312-b830-9aab04107d05\") " pod="calico-system/calico-typha-676bc459d7-rd9m7" Dec 16 12:12:43.831106 kubelet[2916]: I1216 12:12:43.830670 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/26945041-dd74-4312-b830-9aab04107d05-tigera-ca-bundle\") pod \"calico-typha-676bc459d7-rd9m7\" (UID: \"26945041-dd74-4312-b830-9aab04107d05\") " pod="calico-system/calico-typha-676bc459d7-rd9m7" Dec 16 12:12:43.831106 kubelet[2916]: I1216 12:12:43.830699 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tgmx\" (UniqueName: \"kubernetes.io/projected/26945041-dd74-4312-b830-9aab04107d05-kube-api-access-6tgmx\") pod \"calico-typha-676bc459d7-rd9m7\" (UID: \"26945041-dd74-4312-b830-9aab04107d05\") " pod="calico-system/calico-typha-676bc459d7-rd9m7" Dec 16 12:12:43.971375 systemd[1]: Created slice kubepods-besteffort-pod9217f0b5_385c_4a69_8780_fbf6a919844a.slice - libcontainer container kubepods-besteffort-pod9217f0b5_385c_4a69_8780_fbf6a919844a.slice. Dec 16 12:12:44.032813 kubelet[2916]: I1216 12:12:44.032699 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/9217f0b5-385c-4a69-8780-fbf6a919844a-node-certs\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.032813 kubelet[2916]: I1216 12:12:44.032745 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-xtables-lock\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.032813 kubelet[2916]: I1216 12:12:44.032782 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-cni-net-dir\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 
12:12:44.032813 kubelet[2916]: I1216 12:12:44.032800 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tq8hd\" (UniqueName: \"kubernetes.io/projected/9217f0b5-385c-4a69-8780-fbf6a919844a-kube-api-access-tq8hd\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.032813 kubelet[2916]: I1216 12:12:44.032816 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-cni-log-dir\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033000 kubelet[2916]: I1216 12:12:44.032830 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-flexvol-driver-host\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033000 kubelet[2916]: I1216 12:12:44.032848 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-cni-bin-dir\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033000 kubelet[2916]: I1216 12:12:44.032890 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-policysync\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033000 kubelet[2916]: I1216 12:12:44.032921 2916 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-var-lib-calico\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033000 kubelet[2916]: I1216 12:12:44.032961 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-var-run-calico\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033094 kubelet[2916]: I1216 12:12:44.033025 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9217f0b5-385c-4a69-8780-fbf6a919844a-lib-modules\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.033116 kubelet[2916]: I1216 12:12:44.033077 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9217f0b5-385c-4a69-8780-fbf6a919844a-tigera-ca-bundle\") pod \"calico-node-wcj42\" (UID: \"9217f0b5-385c-4a69-8780-fbf6a919844a\") " pod="calico-system/calico-node-wcj42" Dec 16 12:12:44.105504 containerd[1662]: time="2025-12-16T12:12:44.105448991Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-676bc459d7-rd9m7,Uid:26945041-dd74-4312-b830-9aab04107d05,Namespace:calico-system,Attempt:0,}" Dec 16 12:12:44.125135 containerd[1662]: time="2025-12-16T12:12:44.125093285Z" level=info msg="connecting to shim 1a3a900b7f8eea77f22554e645cb72557143119aa880cd2fba07336a88f416fd" 
address="unix:///run/containerd/s/c390bfd4407ac83963475985829d73a9ff51b3006b1295fc26f3023086e83433" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:44.137763 kubelet[2916]: E1216 12:12:44.137734 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.137763 kubelet[2916]: W1216 12:12:44.137757 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.137890 kubelet[2916]: E1216 12:12:44.137782 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.138150 kubelet[2916]: E1216 12:12:44.138133 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.138189 kubelet[2916]: W1216 12:12:44.138149 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.138189 kubelet[2916]: E1216 12:12:44.138161 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.138385 kubelet[2916]: E1216 12:12:44.138370 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.138385 kubelet[2916]: W1216 12:12:44.138383 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.138457 kubelet[2916]: E1216 12:12:44.138394 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.138563 kubelet[2916]: E1216 12:12:44.138551 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.138604 kubelet[2916]: W1216 12:12:44.138564 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.138604 kubelet[2916]: E1216 12:12:44.138573 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.138747 kubelet[2916]: E1216 12:12:44.138734 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.138747 kubelet[2916]: W1216 12:12:44.138746 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.138830 kubelet[2916]: E1216 12:12:44.138754 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.138907 kubelet[2916]: E1216 12:12:44.138895 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.138907 kubelet[2916]: W1216 12:12:44.138906 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.138994 kubelet[2916]: E1216 12:12:44.138914 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.139062 kubelet[2916]: E1216 12:12:44.139049 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.139062 kubelet[2916]: W1216 12:12:44.139060 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.139172 kubelet[2916]: E1216 12:12:44.139068 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.139253 kubelet[2916]: E1216 12:12:44.139235 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.139253 kubelet[2916]: W1216 12:12:44.139246 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.139253 kubelet[2916]: E1216 12:12:44.139254 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.139500 kubelet[2916]: E1216 12:12:44.139474 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.139500 kubelet[2916]: W1216 12:12:44.139496 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.139781 kubelet[2916]: E1216 12:12:44.139508 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.139781 kubelet[2916]: E1216 12:12:44.139721 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.139781 kubelet[2916]: W1216 12:12:44.139730 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.139781 kubelet[2916]: E1216 12:12:44.139741 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.139936 kubelet[2916]: E1216 12:12:44.139905 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.139936 kubelet[2916]: W1216 12:12:44.139918 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.139936 kubelet[2916]: E1216 12:12:44.139926 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.140871 kubelet[2916]: E1216 12:12:44.140849 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.140871 kubelet[2916]: W1216 12:12:44.140866 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.141012 kubelet[2916]: E1216 12:12:44.140878 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.141129 kubelet[2916]: E1216 12:12:44.141111 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.141129 kubelet[2916]: W1216 12:12:44.141124 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.141217 kubelet[2916]: E1216 12:12:44.141134 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.141376 kubelet[2916]: E1216 12:12:44.141360 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.141376 kubelet[2916]: W1216 12:12:44.141374 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.141456 kubelet[2916]: E1216 12:12:44.141383 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.141602 kubelet[2916]: E1216 12:12:44.141588 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.141602 kubelet[2916]: W1216 12:12:44.141601 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.141745 kubelet[2916]: E1216 12:12:44.141610 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.141839 kubelet[2916]: E1216 12:12:44.141828 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.141839 kubelet[2916]: W1216 12:12:44.141838 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.141940 kubelet[2916]: E1216 12:12:44.141847 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.142232 kubelet[2916]: E1216 12:12:44.142219 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.142267 kubelet[2916]: W1216 12:12:44.142232 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.142267 kubelet[2916]: E1216 12:12:44.142243 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.142906 kubelet[2916]: E1216 12:12:44.142891 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.142906 kubelet[2916]: W1216 12:12:44.142904 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.142988 kubelet[2916]: E1216 12:12:44.142916 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.143141 kubelet[2916]: E1216 12:12:44.143127 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.143141 kubelet[2916]: W1216 12:12:44.143140 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.143190 kubelet[2916]: E1216 12:12:44.143150 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.143412 kubelet[2916]: E1216 12:12:44.143399 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.143451 kubelet[2916]: W1216 12:12:44.143412 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.143451 kubelet[2916]: E1216 12:12:44.143422 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.144146 kubelet[2916]: E1216 12:12:44.144100 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.144146 kubelet[2916]: W1216 12:12:44.144146 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.144207 kubelet[2916]: E1216 12:12:44.144159 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.145092 kubelet[2916]: E1216 12:12:44.145070 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.145092 kubelet[2916]: W1216 12:12:44.145090 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.145193 kubelet[2916]: E1216 12:12:44.145103 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.145355 kubelet[2916]: E1216 12:12:44.145339 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.145355 kubelet[2916]: W1216 12:12:44.145351 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.145407 kubelet[2916]: E1216 12:12:44.145361 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.145535 kubelet[2916]: E1216 12:12:44.145524 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.145535 kubelet[2916]: W1216 12:12:44.145535 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.145597 kubelet[2916]: E1216 12:12:44.145543 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.145821 kubelet[2916]: E1216 12:12:44.145806 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.145821 kubelet[2916]: W1216 12:12:44.145820 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.145878 kubelet[2916]: E1216 12:12:44.145855 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.150856 kubelet[2916]: E1216 12:12:44.150824 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.150856 kubelet[2916]: W1216 12:12:44.150845 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.150856 kubelet[2916]: E1216 12:12:44.150859 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.158117 systemd[1]: Started cri-containerd-1a3a900b7f8eea77f22554e645cb72557143119aa880cd2fba07336a88f416fd.scope - libcontainer container 1a3a900b7f8eea77f22554e645cb72557143119aa880cd2fba07336a88f416fd. 
Dec 16 12:12:44.167000 audit: BPF prog-id=151 op=LOAD Dec 16 12:12:44.167000 audit: BPF prog-id=152 op=LOAD Dec 16 12:12:44.167000 audit[3369]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.167000 audit: BPF prog-id=152 op=UNLOAD Dec 16 12:12:44.167000 audit[3369]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.167000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.167000 audit: BPF prog-id=153 op=LOAD Dec 16 12:12:44.167000 audit[3369]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.167000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.168000 audit: BPF prog-id=154 op=LOAD Dec 16 12:12:44.168000 audit[3369]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.168000 audit: BPF prog-id=154 op=UNLOAD Dec 16 12:12:44.168000 audit[3369]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.168000 audit: BPF prog-id=153 op=UNLOAD Dec 16 12:12:44.168000 audit[3369]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:12:44.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.168000 audit: BPF prog-id=155 op=LOAD Dec 16 12:12:44.168000 audit[3369]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3358 pid=3369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.168000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3161336139303062376638656561373766323235353465363435636237 Dec 16 12:12:44.188139 containerd[1662]: time="2025-12-16T12:12:44.188087341Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-676bc459d7-rd9m7,Uid:26945041-dd74-4312-b830-9aab04107d05,Namespace:calico-system,Attempt:0,} returns sandbox id \"1a3a900b7f8eea77f22554e645cb72557143119aa880cd2fba07336a88f416fd\"" Dec 16 12:12:44.189574 containerd[1662]: time="2025-12-16T12:12:44.189334904Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:12:44.507072 kubelet[2916]: E1216 12:12:44.506997 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.507072 kubelet[2916]: W1216 12:12:44.507018 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.507072 kubelet[2916]: E1216 12:12:44.507035 2916 plugins.go:703] 
"Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.516888 kubelet[2916]: E1216 12:12:44.516826 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:12:44.519603 kubelet[2916]: E1216 12:12:44.519567 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.519758 kubelet[2916]: W1216 12:12:44.519624 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.520084 kubelet[2916]: E1216 12:12:44.519835 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.520573 kubelet[2916]: E1216 12:12:44.520550 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.520646 kubelet[2916]: W1216 12:12:44.520569 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.520646 kubelet[2916]: E1216 12:12:44.520611 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.521509 kubelet[2916]: E1216 12:12:44.521483 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.521509 kubelet[2916]: W1216 12:12:44.521505 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.521580 kubelet[2916]: E1216 12:12:44.521520 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.521826 kubelet[2916]: E1216 12:12:44.521806 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.521826 kubelet[2916]: W1216 12:12:44.521821 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.521988 kubelet[2916]: E1216 12:12:44.521950 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.522771 kubelet[2916]: E1216 12:12:44.522734 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.522771 kubelet[2916]: W1216 12:12:44.522756 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.522771 kubelet[2916]: E1216 12:12:44.522770 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.522995 kubelet[2916]: E1216 12:12:44.522969 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.523036 kubelet[2916]: W1216 12:12:44.522985 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.523036 kubelet[2916]: E1216 12:12:44.523018 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.523314 kubelet[2916]: E1216 12:12:44.523302 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.523314 kubelet[2916]: W1216 12:12:44.523314 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.523364 kubelet[2916]: E1216 12:12:44.523326 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.523575 kubelet[2916]: E1216 12:12:44.523483 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.523575 kubelet[2916]: W1216 12:12:44.523494 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.523575 kubelet[2916]: E1216 12:12:44.523501 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.524360 kubelet[2916]: E1216 12:12:44.524250 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.524360 kubelet[2916]: W1216 12:12:44.524262 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.524360 kubelet[2916]: E1216 12:12:44.524274 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.524420 kubelet[2916]: E1216 12:12:44.524411 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.524420 kubelet[2916]: W1216 12:12:44.524418 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.524455 kubelet[2916]: E1216 12:12:44.524426 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.524622 kubelet[2916]: E1216 12:12:44.524601 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.524622 kubelet[2916]: W1216 12:12:44.524617 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.525013 kubelet[2916]: E1216 12:12:44.524982 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.525444 kubelet[2916]: E1216 12:12:44.525400 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.525444 kubelet[2916]: W1216 12:12:44.525437 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.525524 kubelet[2916]: E1216 12:12:44.525458 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.526298 kubelet[2916]: E1216 12:12:44.526266 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.526298 kubelet[2916]: W1216 12:12:44.526288 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.526298 kubelet[2916]: E1216 12:12:44.526302 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.526835 kubelet[2916]: E1216 12:12:44.526805 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.526835 kubelet[2916]: W1216 12:12:44.526828 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.526835 kubelet[2916]: E1216 12:12:44.526840 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.527740 kubelet[2916]: E1216 12:12:44.527688 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.527740 kubelet[2916]: W1216 12:12:44.527725 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.527740 kubelet[2916]: E1216 12:12:44.527740 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.529511 kubelet[2916]: E1216 12:12:44.529484 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.529511 kubelet[2916]: W1216 12:12:44.529503 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.529687 kubelet[2916]: E1216 12:12:44.529520 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.529844 kubelet[2916]: E1216 12:12:44.529756 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.529844 kubelet[2916]: W1216 12:12:44.529765 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.529844 kubelet[2916]: E1216 12:12:44.529776 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.529947 kubelet[2916]: E1216 12:12:44.529932 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.529947 kubelet[2916]: W1216 12:12:44.529943 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.530018 kubelet[2916]: E1216 12:12:44.529954 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.530131 kubelet[2916]: E1216 12:12:44.530070 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.530131 kubelet[2916]: W1216 12:12:44.530079 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.530131 kubelet[2916]: E1216 12:12:44.530086 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.530605 kubelet[2916]: E1216 12:12:44.530258 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.530605 kubelet[2916]: W1216 12:12:44.530265 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.530605 kubelet[2916]: E1216 12:12:44.530272 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.538982 kubelet[2916]: E1216 12:12:44.538961 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.539235 kubelet[2916]: W1216 12:12:44.539086 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.539235 kubelet[2916]: E1216 12:12:44.539110 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.539235 kubelet[2916]: I1216 12:12:44.539144 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4-registration-dir\") pod \"csi-node-driver-cfgbh\" (UID: \"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4\") " pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:44.539468 kubelet[2916]: E1216 12:12:44.539450 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.539590 kubelet[2916]: W1216 12:12:44.539518 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.539590 kubelet[2916]: E1216 12:12:44.539534 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.539590 kubelet[2916]: I1216 12:12:44.539559 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4-socket-dir\") pod \"csi-node-driver-cfgbh\" (UID: \"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4\") " pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:44.540851 kubelet[2916]: E1216 12:12:44.540786 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.540851 kubelet[2916]: W1216 12:12:44.540803 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.540851 kubelet[2916]: E1216 12:12:44.540815 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.541530 kubelet[2916]: E1216 12:12:44.541470 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.541530 kubelet[2916]: W1216 12:12:44.541484 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.541530 kubelet[2916]: E1216 12:12:44.541496 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.541826 kubelet[2916]: E1216 12:12:44.541794 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.541826 kubelet[2916]: W1216 12:12:44.541806 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.541826 kubelet[2916]: E1216 12:12:44.541815 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.542123 kubelet[2916]: I1216 12:12:44.542108 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4-varrun\") pod \"csi-node-driver-cfgbh\" (UID: \"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4\") " pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:44.542329 kubelet[2916]: E1216 12:12:44.542317 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.542389 kubelet[2916]: W1216 12:12:44.542379 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.542438 kubelet[2916]: E1216 12:12:44.542428 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.542670 kubelet[2916]: E1216 12:12:44.542658 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.542748 kubelet[2916]: W1216 12:12:44.542736 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.542798 kubelet[2916]: E1216 12:12:44.542787 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.543969 kubelet[2916]: E1216 12:12:44.543869 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.544452 kubelet[2916]: W1216 12:12:44.544099 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.544452 kubelet[2916]: E1216 12:12:44.544118 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.545004 kubelet[2916]: E1216 12:12:44.544896 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.545202 kubelet[2916]: W1216 12:12:44.545183 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.545365 kubelet[2916]: E1216 12:12:44.545350 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.546190 kubelet[2916]: E1216 12:12:44.545891 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.546190 kubelet[2916]: W1216 12:12:44.545915 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.546190 kubelet[2916]: E1216 12:12:44.545928 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.546190 kubelet[2916]: I1216 12:12:44.545954 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4h5p\" (UniqueName: \"kubernetes.io/projected/15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4-kube-api-access-z4h5p\") pod \"csi-node-driver-cfgbh\" (UID: \"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4\") " pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:44.546536 kubelet[2916]: E1216 12:12:44.546519 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.546703 kubelet[2916]: W1216 12:12:44.546686 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.546785 kubelet[2916]: E1216 12:12:44.546765 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.546850 kubelet[2916]: I1216 12:12:44.546840 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4-kubelet-dir\") pod \"csi-node-driver-cfgbh\" (UID: \"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4\") " pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:44.547315 kubelet[2916]: E1216 12:12:44.547300 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.547414 kubelet[2916]: W1216 12:12:44.547374 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.547414 kubelet[2916]: E1216 12:12:44.547388 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.547989 kubelet[2916]: E1216 12:12:44.547891 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.547989 kubelet[2916]: W1216 12:12:44.547906 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.547989 kubelet[2916]: E1216 12:12:44.547917 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.548261 kubelet[2916]: E1216 12:12:44.548249 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.548261 kubelet[2916]: W1216 12:12:44.548311 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.548261 kubelet[2916]: E1216 12:12:44.548324 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.549309 kubelet[2916]: E1216 12:12:44.549293 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.549426 kubelet[2916]: W1216 12:12:44.549387 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.549426 kubelet[2916]: E1216 12:12:44.549405 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.575795 containerd[1662]: time="2025-12-16T12:12:44.575759820Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wcj42,Uid:9217f0b5-385c-4a69-8780-fbf6a919844a,Namespace:calico-system,Attempt:0,}" Dec 16 12:12:44.604423 containerd[1662]: time="2025-12-16T12:12:44.604371700Z" level=info msg="connecting to shim 23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f" address="unix:///run/containerd/s/511a2f4a7b69e5631dafd6f38394ed92a975aae1789a368dee9c5c5b455a0e0e" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:12:44.634889 systemd[1]: Started cri-containerd-23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f.scope - libcontainer container 23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f. Dec 16 12:12:44.643000 audit: BPF prog-id=156 op=LOAD Dec 16 12:12:44.645348 kernel: kauditd_printk_skb: 48 callbacks suppressed Dec 16 12:12:44.645382 kernel: audit: type=1334 audit(1765887164.643:545): prog-id=156 op=LOAD Dec 16 12:12:44.645396 kernel: audit: type=1334 audit(1765887164.643:546): prog-id=157 op=LOAD Dec 16 12:12:44.643000 audit: BPF prog-id=157 op=LOAD Dec 16 12:12:44.643000 audit[3488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.648174 kubelet[2916]: E1216 12:12:44.648036 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.648174 kubelet[2916]: W1216 12:12:44.648060 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.648174 kubelet[2916]: E1216 
12:12:44.648079 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.648368 kubelet[2916]: E1216 12:12:44.648355 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.648425 kubelet[2916]: W1216 12:12:44.648414 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.648472 kubelet[2916]: E1216 12:12:44.648463 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.648862 kubelet[2916]: E1216 12:12:44.648729 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.648862 kubelet[2916]: W1216 12:12:44.648743 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.648862 kubelet[2916]: E1216 12:12:44.648753 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.649002 kubelet[2916]: E1216 12:12:44.648989 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.649064 kubelet[2916]: W1216 12:12:44.649052 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.649117 kubelet[2916]: E1216 12:12:44.649105 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.649326 kubelet[2916]: E1216 12:12:44.649313 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.649516 kubelet[2916]: W1216 12:12:44.649378 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.649516 kubelet[2916]: E1216 12:12:44.649393 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.649649 kubelet[2916]: E1216 12:12:44.649622 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.649697 kernel: audit: type=1300 audit(1765887164.643:546): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.649754 kubelet[2916]: W1216 12:12:44.649740 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.643000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.649920 kubelet[2916]: E1216 12:12:44.649841 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.650098 kubelet[2916]: E1216 12:12:44.650084 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.650266 kubelet[2916]: W1216 12:12:44.650160 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.650266 kubelet[2916]: E1216 12:12:44.650176 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.650408 kubelet[2916]: E1216 12:12:44.650374 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.650408 kubelet[2916]: W1216 12:12:44.650385 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.650551 kubelet[2916]: E1216 12:12:44.650474 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.650754 kubelet[2916]: E1216 12:12:44.650741 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.650846 kubelet[2916]: W1216 12:12:44.650817 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.650846 kubelet[2916]: E1216 12:12:44.650833 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.651117 kubelet[2916]: E1216 12:12:44.651084 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.651117 kubelet[2916]: W1216 12:12:44.651096 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.651117 kubelet[2916]: E1216 12:12:44.651105 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.651394 kubelet[2916]: E1216 12:12:44.651362 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.651394 kubelet[2916]: W1216 12:12:44.651374 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.651394 kubelet[2916]: E1216 12:12:44.651382 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.651702 kubelet[2916]: E1216 12:12:44.651669 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.651702 kubelet[2916]: W1216 12:12:44.651681 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.651702 kubelet[2916]: E1216 12:12:44.651690 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.652008 kubelet[2916]: E1216 12:12:44.651971 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.652008 kubelet[2916]: W1216 12:12:44.651987 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.652008 kubelet[2916]: E1216 12:12:44.651995 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.652307 kubelet[2916]: E1216 12:12:44.652274 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.652307 kubelet[2916]: W1216 12:12:44.652287 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.652307 kubelet[2916]: E1216 12:12:44.652295 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.652581 kubelet[2916]: E1216 12:12:44.652545 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.652581 kubelet[2916]: W1216 12:12:44.652557 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.652581 kubelet[2916]: E1216 12:12:44.652568 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.652961 kubelet[2916]: E1216 12:12:44.652925 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.652961 kubelet[2916]: W1216 12:12:44.652939 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.652961 kubelet[2916]: E1216 12:12:44.652949 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.653154 kernel: audit: type=1327 audit(1765887164.643:546): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.653482 kernel: audit: type=1334 audit(1765887164.644:547): prog-id=157 op=UNLOAD Dec 16 12:12:44.644000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:12:44.653730 kubelet[2916]: E1216 12:12:44.653692 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.653730 kubelet[2916]: W1216 12:12:44.653705 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.653730 kubelet[2916]: E1216 12:12:44.653716 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.644000 audit[3488]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.654715 kubelet[2916]: E1216 12:12:44.654540 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.654715 kubelet[2916]: W1216 12:12:44.654555 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.654715 kubelet[2916]: E1216 12:12:44.654566 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.655785 kubelet[2916]: E1216 12:12:44.655746 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.655785 kubelet[2916]: W1216 12:12:44.655760 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.655785 kubelet[2916]: E1216 12:12:44.655771 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.656131 kubelet[2916]: E1216 12:12:44.656094 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.656131 kubelet[2916]: W1216 12:12:44.656107 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.656131 kubelet[2916]: E1216 12:12:44.656116 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.656472 kubelet[2916]: E1216 12:12:44.656384 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.656472 kubelet[2916]: W1216 12:12:44.656395 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.656472 kubelet[2916]: E1216 12:12:44.656404 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.656663 kubelet[2916]: E1216 12:12:44.656650 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.656720 kubelet[2916]: W1216 12:12:44.656709 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.656778 kubelet[2916]: E1216 12:12:44.656767 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.657085 kubelet[2916]: E1216 12:12:44.657069 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.657434 kernel: audit: type=1300 audit(1765887164.644:547): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.657499 kernel: audit: type=1327 audit(1765887164.644:547): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.644000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.657560 kubelet[2916]: W1216 12:12:44.657144 2916 driver-call.go:149] FlexVolume: driver call 
failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.657560 kubelet[2916]: E1216 12:12:44.657160 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.658179 kubelet[2916]: E1216 12:12:44.658163 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.658257 kubelet[2916]: W1216 12:12:44.658245 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.658329 kubelet[2916]: E1216 12:12:44.658318 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.659101 kubelet[2916]: E1216 12:12:44.659080 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.659181 kubelet[2916]: W1216 12:12:44.659169 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.659366 kubelet[2916]: E1216 12:12:44.659237 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:44.644000 audit: BPF prog-id=158 op=LOAD Dec 16 12:12:44.661893 kernel: audit: type=1334 audit(1765887164.644:548): prog-id=158 op=LOAD Dec 16 12:12:44.644000 audit[3488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.665443 kernel: audit: type=1300 audit(1765887164.644:548): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.644000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.668892 kernel: audit: type=1327 audit(1765887164.644:548): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.645000 audit: BPF prog-id=159 op=LOAD Dec 16 12:12:44.645000 audit[3488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.645000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.649000 audit: BPF prog-id=159 op=UNLOAD Dec 16 12:12:44.649000 audit[3488]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.649000 audit: BPF prog-id=158 op=UNLOAD Dec 16 12:12:44.649000 audit[3488]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.649000 audit: BPF prog-id=160 op=LOAD Dec 16 12:12:44.649000 audit[3488]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3478 pid=3488 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:12:44.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3233313031653038343438373266303065393031376135633432623834 Dec 16 12:12:44.672981 kubelet[2916]: E1216 12:12:44.672920 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:44.672981 kubelet[2916]: W1216 12:12:44.672940 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:44.672981 kubelet[2916]: E1216 12:12:44.672963 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:44.681551 containerd[1662]: time="2025-12-16T12:12:44.681516155Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wcj42,Uid:9217f0b5-385c-4a69-8780-fbf6a919844a,Namespace:calico-system,Attempt:0,} returns sandbox id \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\"" Dec 16 12:12:44.840000 audit[3544]: NETFILTER_CFG table=filter:121 family=2 entries=22 op=nft_register_rule pid=3544 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:44.840000 audit[3544]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe3a38670 a2=0 a3=1 items=0 ppid=3060 pid=3544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.840000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 
12:12:44.845000 audit[3544]: NETFILTER_CFG table=nat:122 family=2 entries=12 op=nft_register_rule pid=3544 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:12:44.845000 audit[3544]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe3a38670 a2=0 a3=1 items=0 ppid=3060 pid=3544 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:44.845000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:12:45.709862 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2464484267.mount: Deactivated successfully. Dec 16 12:12:46.064372 kubelet[2916]: E1216 12:12:46.064127 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:12:46.985279 containerd[1662]: time="2025-12-16T12:12:46.985213651Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:46.986588 containerd[1662]: time="2025-12-16T12:12:46.986545894Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Dec 16 12:12:46.988213 containerd[1662]: time="2025-12-16T12:12:46.987683377Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:46.990298 containerd[1662]: time="2025-12-16T12:12:46.990256225Z" level=info msg="ImageCreate event 
name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:46.990963 containerd[1662]: time="2025-12-16T12:12:46.990928906Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.801550762s" Dec 16 12:12:46.991125 containerd[1662]: time="2025-12-16T12:12:46.990967587Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 12:12:46.992462 containerd[1662]: time="2025-12-16T12:12:46.992438111Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:12:47.002392 containerd[1662]: time="2025-12-16T12:12:47.002339338Z" level=info msg="CreateContainer within sandbox \"1a3a900b7f8eea77f22554e645cb72557143119aa880cd2fba07336a88f416fd\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:12:47.010170 containerd[1662]: time="2025-12-16T12:12:47.010097880Z" level=info msg="Container 9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:47.019396 containerd[1662]: time="2025-12-16T12:12:47.019337506Z" level=info msg="CreateContainer within sandbox \"1a3a900b7f8eea77f22554e645cb72557143119aa880cd2fba07336a88f416fd\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73\"" Dec 16 12:12:47.019906 containerd[1662]: time="2025-12-16T12:12:47.019798667Z" level=info msg="StartContainer for 
\"9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73\"" Dec 16 12:12:47.021006 containerd[1662]: time="2025-12-16T12:12:47.020973470Z" level=info msg="connecting to shim 9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73" address="unix:///run/containerd/s/c390bfd4407ac83963475985829d73a9ff51b3006b1295fc26f3023086e83433" protocol=ttrpc version=3 Dec 16 12:12:47.043926 systemd[1]: Started cri-containerd-9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73.scope - libcontainer container 9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73. Dec 16 12:12:47.055000 audit: BPF prog-id=161 op=LOAD Dec 16 12:12:47.055000 audit: BPF prog-id=162 op=LOAD Dec 16 12:12:47.055000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:47.055000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.056000 audit: BPF prog-id=162 op=UNLOAD Dec 16 12:12:47.056000 audit[3555]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:47.056000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.056000 audit: BPF prog-id=163 op=LOAD Dec 16 12:12:47.056000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:47.056000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.056000 audit: BPF prog-id=164 op=LOAD Dec 16 12:12:47.056000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:47.056000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.056000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:12:47.056000 audit[3555]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:12:47.056000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.056000 audit: BPF prog-id=163 op=UNLOAD Dec 16 12:12:47.056000 audit[3555]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:47.056000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.056000 audit: BPF prog-id=165 op=LOAD Dec 16 12:12:47.056000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3358 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:47.056000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3962343765303637303665643862663533313961616366333635323638 Dec 16 12:12:47.080702 containerd[1662]: time="2025-12-16T12:12:47.080589716Z" level=info msg="StartContainer for \"9b47e06706ed8bf5319aacf3652680754744acfaa5ce99ad966d0a60927a3e73\" returns successfully" Dec 16 12:12:47.176245 kubelet[2916]: I1216 12:12:47.176180 2916 pod_startup_latency_tracker.go:104] "Observed pod 
startup duration" pod="calico-system/calico-typha-676bc459d7-rd9m7" podStartSLOduration=1.373452097 podStartE2EDuration="4.176162102s" podCreationTimestamp="2025-12-16 12:12:43 +0000 UTC" firstStartedPulling="2025-12-16 12:12:44.189122384 +0000 UTC m=+27.201434307" lastFinishedPulling="2025-12-16 12:12:46.991832429 +0000 UTC m=+30.004144312" observedRunningTime="2025-12-16 12:12:47.175711621 +0000 UTC m=+30.188023584" watchObservedRunningTime="2025-12-16 12:12:47.176162102 +0000 UTC m=+30.188474065" Dec 16 12:12:47.249465 kubelet[2916]: E1216 12:12:47.249350 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.249465 kubelet[2916]: W1216 12:12:47.249377 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.249465 kubelet[2916]: E1216 12:12:47.249398 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.249642 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.250652 kubelet[2916]: W1216 12:12:47.249656 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.249701 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.249993 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.250652 kubelet[2916]: W1216 12:12:47.250004 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.250013 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.250218 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.250652 kubelet[2916]: W1216 12:12:47.250227 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.250236 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.250652 kubelet[2916]: E1216 12:12:47.250391 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.250929 kubelet[2916]: W1216 12:12:47.250399 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.250929 kubelet[2916]: E1216 12:12:47.250407 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.250929 kubelet[2916]: E1216 12:12:47.250652 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.250929 kubelet[2916]: W1216 12:12:47.250662 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.250929 kubelet[2916]: E1216 12:12:47.250672 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.250929 kubelet[2916]: E1216 12:12:47.250809 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.250929 kubelet[2916]: W1216 12:12:47.250815 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.250929 kubelet[2916]: E1216 12:12:47.250822 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.251095 kubelet[2916]: E1216 12:12:47.251068 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.251095 kubelet[2916]: W1216 12:12:47.251084 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.251095 kubelet[2916]: E1216 12:12:47.251093 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.251287 kubelet[2916]: E1216 12:12:47.251267 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.251287 kubelet[2916]: W1216 12:12:47.251284 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.251367 kubelet[2916]: E1216 12:12:47.251299 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.251548 kubelet[2916]: E1216 12:12:47.251519 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.251548 kubelet[2916]: W1216 12:12:47.251532 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.251548 kubelet[2916]: E1216 12:12:47.251542 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.251734 kubelet[2916]: E1216 12:12:47.251712 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.251734 kubelet[2916]: W1216 12:12:47.251726 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.251734 kubelet[2916]: E1216 12:12:47.251735 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.251912 kubelet[2916]: E1216 12:12:47.251890 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.251912 kubelet[2916]: W1216 12:12:47.251901 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.251912 kubelet[2916]: E1216 12:12:47.251909 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.252072 kubelet[2916]: E1216 12:12:47.252053 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.252072 kubelet[2916]: W1216 12:12:47.252062 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.252072 kubelet[2916]: E1216 12:12:47.252069 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.252210 kubelet[2916]: E1216 12:12:47.252191 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.252210 kubelet[2916]: W1216 12:12:47.252200 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.252210 kubelet[2916]: E1216 12:12:47.252207 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.252343 kubelet[2916]: E1216 12:12:47.252325 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.252343 kubelet[2916]: W1216 12:12:47.252335 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.252343 kubelet[2916]: E1216 12:12:47.252342 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.268361 kubelet[2916]: E1216 12:12:47.268319 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.268361 kubelet[2916]: W1216 12:12:47.268347 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.268361 kubelet[2916]: E1216 12:12:47.268365 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.268811 kubelet[2916]: E1216 12:12:47.268776 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.268811 kubelet[2916]: W1216 12:12:47.268797 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.268811 kubelet[2916]: E1216 12:12:47.268810 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.268986 kubelet[2916]: E1216 12:12:47.268966 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.268986 kubelet[2916]: W1216 12:12:47.268976 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.268986 kubelet[2916]: E1216 12:12:47.268986 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.269166 kubelet[2916]: E1216 12:12:47.269144 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.269166 kubelet[2916]: W1216 12:12:47.269156 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.269166 kubelet[2916]: E1216 12:12:47.269165 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.269367 kubelet[2916]: E1216 12:12:47.269351 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.269367 kubelet[2916]: W1216 12:12:47.269363 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.269427 kubelet[2916]: E1216 12:12:47.269373 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.269815 kubelet[2916]: E1216 12:12:47.269785 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.269815 kubelet[2916]: W1216 12:12:47.269799 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.269815 kubelet[2916]: E1216 12:12:47.269810 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.270740 kubelet[2916]: E1216 12:12:47.270712 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.270740 kubelet[2916]: W1216 12:12:47.270729 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.270740 kubelet[2916]: E1216 12:12:47.270742 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.271009 kubelet[2916]: E1216 12:12:47.270992 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.271009 kubelet[2916]: W1216 12:12:47.271005 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.271072 kubelet[2916]: E1216 12:12:47.271018 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.271178 kubelet[2916]: E1216 12:12:47.271163 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.271178 kubelet[2916]: W1216 12:12:47.271178 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.271239 kubelet[2916]: E1216 12:12:47.271187 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.272925 kubelet[2916]: E1216 12:12:47.272901 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.272925 kubelet[2916]: W1216 12:12:47.272917 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.272925 kubelet[2916]: E1216 12:12:47.272928 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.273120 kubelet[2916]: E1216 12:12:47.273102 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.273120 kubelet[2916]: W1216 12:12:47.273115 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.273182 kubelet[2916]: E1216 12:12:47.273124 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.273316 kubelet[2916]: E1216 12:12:47.273289 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.273316 kubelet[2916]: W1216 12:12:47.273302 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.273316 kubelet[2916]: E1216 12:12:47.273311 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.273498 kubelet[2916]: E1216 12:12:47.273475 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.273498 kubelet[2916]: W1216 12:12:47.273489 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.273498 kubelet[2916]: E1216 12:12:47.273499 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.273842 kubelet[2916]: E1216 12:12:47.273820 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.273842 kubelet[2916]: W1216 12:12:47.273836 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.273842 kubelet[2916]: E1216 12:12:47.273850 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.274134 kubelet[2916]: E1216 12:12:47.274119 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.274134 kubelet[2916]: W1216 12:12:47.274130 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.274209 kubelet[2916]: E1216 12:12:47.274139 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.274412 kubelet[2916]: E1216 12:12:47.274399 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.274462 kubelet[2916]: W1216 12:12:47.274410 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.274462 kubelet[2916]: E1216 12:12:47.274442 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:47.275725 kubelet[2916]: E1216 12:12:47.275699 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.275725 kubelet[2916]: W1216 12:12:47.275719 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.275826 kubelet[2916]: E1216 12:12:47.275732 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:47.275985 kubelet[2916]: E1216 12:12:47.275963 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:47.275985 kubelet[2916]: W1216 12:12:47.275977 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:47.275985 kubelet[2916]: E1216 12:12:47.275988 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.064495 kubelet[2916]: E1216 12:12:48.064425 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:12:48.159990 kubelet[2916]: I1216 12:12:48.159957 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:12:48.259939 kubelet[2916]: E1216 12:12:48.259871 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.259939 kubelet[2916]: W1216 12:12:48.259897 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.259939 kubelet[2916]: E1216 12:12:48.259916 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.260381 kubelet[2916]: E1216 12:12:48.260109 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.260381 kubelet[2916]: W1216 12:12:48.260116 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.260381 kubelet[2916]: E1216 12:12:48.260159 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.260381 kubelet[2916]: E1216 12:12:48.260322 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.260381 kubelet[2916]: W1216 12:12:48.260328 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.260381 kubelet[2916]: E1216 12:12:48.260336 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.260509 kubelet[2916]: E1216 12:12:48.260469 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.260509 kubelet[2916]: W1216 12:12:48.260475 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.260509 kubelet[2916]: E1216 12:12:48.260483 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.260662 kubelet[2916]: E1216 12:12:48.260617 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.260662 kubelet[2916]: W1216 12:12:48.260643 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.260662 kubelet[2916]: E1216 12:12:48.260652 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.260809 kubelet[2916]: E1216 12:12:48.260788 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.260809 kubelet[2916]: W1216 12:12:48.260799 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.260809 kubelet[2916]: E1216 12:12:48.260806 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.260949 kubelet[2916]: E1216 12:12:48.260929 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.260949 kubelet[2916]: W1216 12:12:48.260939 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.260949 kubelet[2916]: E1216 12:12:48.260949 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.261088 kubelet[2916]: E1216 12:12:48.261077 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261088 kubelet[2916]: W1216 12:12:48.261087 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261139 kubelet[2916]: E1216 12:12:48.261094 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.261235 kubelet[2916]: E1216 12:12:48.261225 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261259 kubelet[2916]: W1216 12:12:48.261234 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261259 kubelet[2916]: E1216 12:12:48.261241 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.261368 kubelet[2916]: E1216 12:12:48.261359 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261392 kubelet[2916]: W1216 12:12:48.261375 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261392 kubelet[2916]: E1216 12:12:48.261382 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.261496 kubelet[2916]: E1216 12:12:48.261487 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261517 kubelet[2916]: W1216 12:12:48.261496 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261517 kubelet[2916]: E1216 12:12:48.261504 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.261625 kubelet[2916]: E1216 12:12:48.261616 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261666 kubelet[2916]: W1216 12:12:48.261625 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261666 kubelet[2916]: E1216 12:12:48.261653 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.261796 kubelet[2916]: E1216 12:12:48.261784 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261820 kubelet[2916]: W1216 12:12:48.261794 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261820 kubelet[2916]: E1216 12:12:48.261810 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.261990 kubelet[2916]: E1216 12:12:48.261942 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.261990 kubelet[2916]: W1216 12:12:48.261960 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.261990 kubelet[2916]: E1216 12:12:48.261968 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.262093 kubelet[2916]: E1216 12:12:48.262083 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.262093 kubelet[2916]: W1216 12:12:48.262092 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.262137 kubelet[2916]: E1216 12:12:48.262099 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.278732 kubelet[2916]: E1216 12:12:48.278704 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.278732 kubelet[2916]: W1216 12:12:48.278721 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.278732 kubelet[2916]: E1216 12:12:48.278734 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.278951 kubelet[2916]: E1216 12:12:48.278934 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.278951 kubelet[2916]: W1216 12:12:48.278946 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.278998 kubelet[2916]: E1216 12:12:48.278955 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.279171 kubelet[2916]: E1216 12:12:48.279157 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.279171 kubelet[2916]: W1216 12:12:48.279169 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.279231 kubelet[2916]: E1216 12:12:48.279179 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.279391 kubelet[2916]: E1216 12:12:48.279379 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.279391 kubelet[2916]: W1216 12:12:48.279390 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.279453 kubelet[2916]: E1216 12:12:48.279399 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.279593 kubelet[2916]: E1216 12:12:48.279547 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.279593 kubelet[2916]: W1216 12:12:48.279558 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.279593 kubelet[2916]: E1216 12:12:48.279566 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.279732 kubelet[2916]: E1216 12:12:48.279719 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.279755 kubelet[2916]: W1216 12:12:48.279731 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.279755 kubelet[2916]: E1216 12:12:48.279740 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.279969 kubelet[2916]: E1216 12:12:48.279954 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.279969 kubelet[2916]: W1216 12:12:48.279967 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.280027 kubelet[2916]: E1216 12:12:48.279976 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.280281 kubelet[2916]: E1216 12:12:48.280245 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.280281 kubelet[2916]: W1216 12:12:48.280263 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.280281 kubelet[2916]: E1216 12:12:48.280276 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.280442 kubelet[2916]: E1216 12:12:48.280430 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.280476 kubelet[2916]: W1216 12:12:48.280445 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.280476 kubelet[2916]: E1216 12:12:48.280453 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.280594 kubelet[2916]: E1216 12:12:48.280583 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.280594 kubelet[2916]: W1216 12:12:48.280592 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.280708 kubelet[2916]: E1216 12:12:48.280600 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.280749 kubelet[2916]: E1216 12:12:48.280737 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.280749 kubelet[2916]: W1216 12:12:48.280747 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.280799 kubelet[2916]: E1216 12:12:48.280754 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.280931 kubelet[2916]: E1216 12:12:48.280920 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.280931 kubelet[2916]: W1216 12:12:48.280929 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.280981 kubelet[2916]: E1216 12:12:48.280937 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.281095 kubelet[2916]: E1216 12:12:48.281084 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.281095 kubelet[2916]: W1216 12:12:48.281094 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.281148 kubelet[2916]: E1216 12:12:48.281101 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.281333 kubelet[2916]: E1216 12:12:48.281318 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.281366 kubelet[2916]: W1216 12:12:48.281333 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.281366 kubelet[2916]: E1216 12:12:48.281347 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.281499 kubelet[2916]: E1216 12:12:48.281486 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.281499 kubelet[2916]: W1216 12:12:48.281496 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.281547 kubelet[2916]: E1216 12:12:48.281504 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.281701 kubelet[2916]: E1216 12:12:48.281690 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.281701 kubelet[2916]: W1216 12:12:48.281700 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.281750 kubelet[2916]: E1216 12:12:48.281709 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.281942 kubelet[2916]: E1216 12:12:48.281927 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.281976 kubelet[2916]: W1216 12:12:48.281941 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.281976 kubelet[2916]: E1216 12:12:48.281951 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:12:48.282140 kubelet[2916]: E1216 12:12:48.282127 2916 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:12:48.282140 kubelet[2916]: W1216 12:12:48.282138 2916 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:12:48.282185 kubelet[2916]: E1216 12:12:48.282146 2916 plugins.go:703] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:12:48.643950 containerd[1662]: time="2025-12-16T12:12:48.643822350Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:48.644720 containerd[1662]: time="2025-12-16T12:12:48.644676872Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4262566" Dec 16 12:12:48.645729 containerd[1662]: time="2025-12-16T12:12:48.645706875Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:48.647781 containerd[1662]: time="2025-12-16T12:12:48.647738680Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:12:48.648336 containerd[1662]: time="2025-12-16T12:12:48.648300642Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.655832171s" Dec 16 12:12:48.648336 containerd[1662]: time="2025-12-16T12:12:48.648328722Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 12:12:48.654104 containerd[1662]: time="2025-12-16T12:12:48.654048498Z" level=info msg="CreateContainer within sandbox \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\" for container 
&ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:12:48.665752 containerd[1662]: time="2025-12-16T12:12:48.665712851Z" level=info msg="Container ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:12:48.673555 containerd[1662]: time="2025-12-16T12:12:48.673495712Z" level=info msg="CreateContainer within sandbox \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18\"" Dec 16 12:12:48.674545 containerd[1662]: time="2025-12-16T12:12:48.674107594Z" level=info msg="StartContainer for \"ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18\"" Dec 16 12:12:48.676012 containerd[1662]: time="2025-12-16T12:12:48.675962679Z" level=info msg="connecting to shim ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18" address="unix:///run/containerd/s/511a2f4a7b69e5631dafd6f38394ed92a975aae1789a368dee9c5c5b455a0e0e" protocol=ttrpc version=3 Dec 16 12:12:48.695846 systemd[1]: Started cri-containerd-ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18.scope - libcontainer container ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18. 
Dec 16 12:12:48.750000 audit: BPF prog-id=166 op=LOAD Dec 16 12:12:48.750000 audit[3666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3478 pid=3666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:48.750000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562646362626237383563653366313434343033316235363232363831 Dec 16 12:12:48.750000 audit: BPF prog-id=167 op=LOAD Dec 16 12:12:48.750000 audit[3666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3478 pid=3666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:48.750000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562646362626237383563653366313434343033316235363232363831 Dec 16 12:12:48.750000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:12:48.750000 audit[3666]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:48.750000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562646362626237383563653366313434343033316235363232363831 Dec 16 12:12:48.750000 audit: BPF prog-id=166 op=UNLOAD Dec 16 12:12:48.750000 audit[3666]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:48.750000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562646362626237383563653366313434343033316235363232363831 Dec 16 12:12:48.750000 audit: BPF prog-id=168 op=LOAD Dec 16 12:12:48.750000 audit[3666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3478 pid=3666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:12:48.750000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6562646362626237383563653366313434343033316235363232363831 Dec 16 12:12:48.771078 containerd[1662]: time="2025-12-16T12:12:48.770963624Z" level=info msg="StartContainer for \"ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18\" returns successfully" Dec 16 12:12:48.783558 systemd[1]: cri-containerd-ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18.scope: Deactivated successfully. 
Dec 16 12:12:48.787905 containerd[1662]: time="2025-12-16T12:12:48.787855791Z" level=info msg="received container exit event container_id:\"ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18\" id:\"ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18\" pid:3679 exited_at:{seconds:1765887168 nanos:787404109}"
Dec 16 12:12:48.789000 audit: BPF prog-id=168 op=UNLOAD
Dec 16 12:12:48.807591 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-ebdcbbb785ce3f1444031b5622681300796a50f477246b7843a128d1b5400c18-rootfs.mount: Deactivated successfully.
Dec 16 12:12:49.165290 containerd[1662]: time="2025-12-16T12:12:49.164868681Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\""
Dec 16 12:12:50.064477 kubelet[2916]: E1216 12:12:50.064367 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4"
Dec 16 12:12:52.063610 kubelet[2916]: E1216 12:12:52.063533 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4"
Dec 16 12:12:52.569324 containerd[1662]: time="2025-12-16T12:12:52.569269642Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:12:52.570582 containerd[1662]: time="2025-12-16T12:12:52.570541245Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248"
Dec 16 12:12:52.571761 containerd[1662]: time="2025-12-16T12:12:52.571695608Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:12:52.573858 containerd[1662]: time="2025-12-16T12:12:52.573809774Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}"
Dec 16 12:12:52.574372 containerd[1662]: time="2025-12-16T12:12:52.574343136Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.409435735s"
Dec 16 12:12:52.574415 containerd[1662]: time="2025-12-16T12:12:52.574377176Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\""
Dec 16 12:12:52.577977 containerd[1662]: time="2025-12-16T12:12:52.577944826Z" level=info msg="CreateContainer within sandbox \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}"
Dec 16 12:12:52.587208 containerd[1662]: time="2025-12-16T12:12:52.587159291Z" level=info msg="Container 573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27: CDI devices from CRI Config.CDIDevices: []"
Dec 16 12:12:52.596214 containerd[1662]: time="2025-12-16T12:12:52.596169076Z" level=info msg="CreateContainer within sandbox \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27\""
Dec 16 12:12:52.598669 containerd[1662]: time="2025-12-16T12:12:52.596698278Z" level=info msg="StartContainer for \"573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27\""
Dec 16 12:12:52.599388 containerd[1662]: time="2025-12-16T12:12:52.599350925Z" level=info msg="connecting to shim 573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27" address="unix:///run/containerd/s/511a2f4a7b69e5631dafd6f38394ed92a975aae1789a368dee9c5c5b455a0e0e" protocol=ttrpc version=3
Dec 16 12:12:52.619848 systemd[1]: Started cri-containerd-573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27.scope - libcontainer container 573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27.
Dec 16 12:12:52.686656 kernel: kauditd_printk_skb: 56 callbacks suppressed
Dec 16 12:12:52.686765 kernel: audit: type=1334 audit(1765887172.683:569): prog-id=169 op=LOAD
Dec 16 12:12:52.686789 kernel: audit: type=1300 audit(1765887172.683:569): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.683000 audit: BPF prog-id=169 op=LOAD
Dec 16 12:12:52.683000 audit[3727]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.683000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.692246 kernel: audit: type=1327 audit(1765887172.683:569): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.692299 kernel: audit: type=1334 audit(1765887172.684:570): prog-id=170 op=LOAD
Dec 16 12:12:52.684000 audit: BPF prog-id=170 op=LOAD
Dec 16 12:12:52.684000 audit[3727]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.696553 kernel: audit: type=1300 audit(1765887172.684:570): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.696692 kernel: audit: type=1327 audit(1765887172.684:570): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.684000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.688000 audit: BPF prog-id=170 op=UNLOAD
Dec 16 12:12:52.700915 kernel: audit: type=1334 audit(1765887172.688:571): prog-id=170 op=UNLOAD
Dec 16 12:12:52.700953 kernel: audit: type=1300 audit(1765887172.688:571): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.688000 audit[3727]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.688000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.707581 kernel: audit: type=1327 audit(1765887172.688:571): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.688000 audit: BPF prog-id=169 op=UNLOAD
Dec 16 12:12:52.708650 kernel: audit: type=1334 audit(1765887172.688:572): prog-id=169 op=UNLOAD
Dec 16 12:12:52.688000 audit[3727]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.688000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.688000 audit: BPF prog-id=171 op=LOAD
Dec 16 12:12:52.688000 audit[3727]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3478 pid=3727 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:12:52.688000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3537336562383836313064333963666266343566626632613264626566
Dec 16 12:12:52.725557 containerd[1662]: time="2025-12-16T12:12:52.725517957Z" level=info msg="StartContainer for \"573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27\" returns successfully"
Dec 16 12:12:53.272158 containerd[1662]: time="2025-12-16T12:12:53.271768118Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config"
Dec 16 12:12:53.274908 systemd[1]: cri-containerd-573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27.scope: Deactivated successfully.
Dec 16 12:12:53.275716 containerd[1662]: time="2025-12-16T12:12:53.275112487Z" level=info msg="received container exit event container_id:\"573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27\" id:\"573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27\" pid:3740 exited_at:{seconds:1765887173 nanos:274720526}"
Dec 16 12:12:53.275217 systemd[1]: cri-containerd-573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27.scope: Consumed 456ms CPU time, 186.8M memory peak, 165.9M written to disk.
Dec 16 12:12:53.280000 audit: BPF prog-id=171 op=UNLOAD
Dec 16 12:12:53.298094 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-573eb88610d39cfbf45fbf2a2dbefb74a7fdccb745999d09306f78ec11a1fe27-rootfs.mount: Deactivated successfully.
Dec 16 12:12:53.357072 kubelet[2916]: I1216 12:12:53.357045 2916 kubelet_node_status.go:501] "Fast updating node status as it just became ready"
Dec 16 12:12:53.397964 systemd[1]: Created slice kubepods-burstable-pod78a6a3e9_dded_4049_98d3_1cf9591828bf.slice - libcontainer container kubepods-burstable-pod78a6a3e9_dded_4049_98d3_1cf9591828bf.slice.
Dec 16 12:12:53.412886 systemd[1]: Created slice kubepods-burstable-pod7883990b_d1d6_4772_8df2_71ee5dd1032b.slice - libcontainer container kubepods-burstable-pod7883990b_d1d6_4772_8df2_71ee5dd1032b.slice.
Dec 16 12:12:53.414269 kubelet[2916]: I1216 12:12:53.414072 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/fd2217b9-d056-43fe-a117-755ac4fcec22-calico-apiserver-certs\") pod \"calico-apiserver-585b8759c9-djtxq\" (UID: \"fd2217b9-d056-43fe-a117-755ac4fcec22\") " pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq"
Dec 16 12:12:53.414269 kubelet[2916]: I1216 12:12:53.414108 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2qnzc\" (UniqueName: \"kubernetes.io/projected/7883990b-d1d6-4772-8df2-71ee5dd1032b-kube-api-access-2qnzc\") pod \"coredns-674b8bbfcf-tdcxp\" (UID: \"7883990b-d1d6-4772-8df2-71ee5dd1032b\") " pod="kube-system/coredns-674b8bbfcf-tdcxp"
Dec 16 12:12:53.414269 kubelet[2916]: I1216 12:12:53.414126 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jnw7g\" (UniqueName: \"kubernetes.io/projected/fd2217b9-d056-43fe-a117-755ac4fcec22-kube-api-access-jnw7g\") pod \"calico-apiserver-585b8759c9-djtxq\" (UID: \"fd2217b9-d056-43fe-a117-755ac4fcec22\") " pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq"
Dec 16 12:12:53.414269 kubelet[2916]: I1216 12:12:53.414144 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xgkkb\" (UniqueName: \"kubernetes.io/projected/9aa83703-a98a-4b61-a3ff-9d2f2ed50abd-kube-api-access-xgkkb\") pod \"calico-kube-controllers-5d8494b5b4-8w8m7\" (UID: \"9aa83703-a98a-4b61-a3ff-9d2f2ed50abd\") " pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7"
Dec 16 12:12:53.414269 kubelet[2916]: I1216 12:12:53.414165 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78a6a3e9-dded-4049-98d3-1cf9591828bf-config-volume\") pod \"coredns-674b8bbfcf-jp97r\" (UID: \"78a6a3e9-dded-4049-98d3-1cf9591828bf\") " pod="kube-system/coredns-674b8bbfcf-jp97r"
Dec 16 12:12:53.414459 kubelet[2916]: I1216 12:12:53.414183 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w2j9t\" (UniqueName: \"kubernetes.io/projected/78a6a3e9-dded-4049-98d3-1cf9591828bf-kube-api-access-w2j9t\") pod \"coredns-674b8bbfcf-jp97r\" (UID: \"78a6a3e9-dded-4049-98d3-1cf9591828bf\") " pod="kube-system/coredns-674b8bbfcf-jp97r"
Dec 16 12:12:53.414459 kubelet[2916]: I1216 12:12:53.414202 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/7883990b-d1d6-4772-8df2-71ee5dd1032b-config-volume\") pod \"coredns-674b8bbfcf-tdcxp\" (UID: \"7883990b-d1d6-4772-8df2-71ee5dd1032b\") " pod="kube-system/coredns-674b8bbfcf-tdcxp"
Dec 16 12:12:53.414459 kubelet[2916]: I1216 12:12:53.414218 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9aa83703-a98a-4b61-a3ff-9d2f2ed50abd-tigera-ca-bundle\") pod \"calico-kube-controllers-5d8494b5b4-8w8m7\" (UID: \"9aa83703-a98a-4b61-a3ff-9d2f2ed50abd\") " pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7"
Dec 16 12:12:53.422209 systemd[1]: Created slice kubepods-besteffort-pod9aa83703_a98a_4b61_a3ff_9d2f2ed50abd.slice - libcontainer container kubepods-besteffort-pod9aa83703_a98a_4b61_a3ff_9d2f2ed50abd.slice.
Dec 16 12:12:53.428933 systemd[1]: Created slice kubepods-besteffort-podfd2217b9_d056_43fe_a117_755ac4fcec22.slice - libcontainer container kubepods-besteffort-podfd2217b9_d056_43fe_a117_755ac4fcec22.slice.
Dec 16 12:12:53.434715 systemd[1]: Created slice kubepods-besteffort-pod908b1695_a850_4afd_b4ae_bc3924c46a24.slice - libcontainer container kubepods-besteffort-pod908b1695_a850_4afd_b4ae_bc3924c46a24.slice.
Dec 16 12:12:53.441911 systemd[1]: Created slice kubepods-besteffort-pode4dab5b6_c490_4927_98c0_45033692e43c.slice - libcontainer container kubepods-besteffort-pode4dab5b6_c490_4927_98c0_45033692e43c.slice.
Dec 16 12:12:53.445832 systemd[1]: Created slice kubepods-besteffort-poda0cf5bad_dc9b_4b29_b409_00ae25b08c7c.slice - libcontainer container kubepods-besteffort-poda0cf5bad_dc9b_4b29_b409_00ae25b08c7c.slice.
Dec 16 12:12:53.515125 kubelet[2916]: I1216 12:12:53.515023 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/a0cf5bad-dc9b-4b29-b409-00ae25b08c7c-goldmane-key-pair\") pod \"goldmane-666569f655-7qscd\" (UID: \"a0cf5bad-dc9b-4b29-b409-00ae25b08c7c\") " pod="calico-system/goldmane-666569f655-7qscd"
Dec 16 12:12:53.515125 kubelet[2916]: I1216 12:12:53.515070 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a0cf5bad-dc9b-4b29-b409-00ae25b08c7c-config\") pod \"goldmane-666569f655-7qscd\" (UID: \"a0cf5bad-dc9b-4b29-b409-00ae25b08c7c\") " pod="calico-system/goldmane-666569f655-7qscd"
Dec 16 12:12:53.515125 kubelet[2916]: I1216 12:12:53.515090 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rjg7b\" (UniqueName: \"kubernetes.io/projected/908b1695-a850-4afd-b4ae-bc3924c46a24-kube-api-access-rjg7b\") pod \"whisker-6c69d4f66c-hblck\" (UID: \"908b1695-a850-4afd-b4ae-bc3924c46a24\") " pod="calico-system/whisker-6c69d4f66c-hblck"
Dec 16 12:12:53.515400 kubelet[2916]: I1216 12:12:53.515193 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e4dab5b6-c490-4927-98c0-45033692e43c-calico-apiserver-certs\") pod \"calico-apiserver-585b8759c9-ggqjc\" (UID: \"e4dab5b6-c490-4927-98c0-45033692e43c\") " pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc"
Dec 16 12:12:53.515400 kubelet[2916]: I1216 12:12:53.515249 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-backend-key-pair\") pod \"whisker-6c69d4f66c-hblck\" (UID: \"908b1695-a850-4afd-b4ae-bc3924c46a24\") " pod="calico-system/whisker-6c69d4f66c-hblck"
Dec 16 12:12:53.515400 kubelet[2916]: I1216 12:12:53.515285 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a0cf5bad-dc9b-4b29-b409-00ae25b08c7c-goldmane-ca-bundle\") pod \"goldmane-666569f655-7qscd\" (UID: \"a0cf5bad-dc9b-4b29-b409-00ae25b08c7c\") " pod="calico-system/goldmane-666569f655-7qscd"
Dec 16 12:12:53.515400 kubelet[2916]: I1216 12:12:53.515303 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5cprm\" (UniqueName: \"kubernetes.io/projected/a0cf5bad-dc9b-4b29-b409-00ae25b08c7c-kube-api-access-5cprm\") pod \"goldmane-666569f655-7qscd\" (UID: \"a0cf5bad-dc9b-4b29-b409-00ae25b08c7c\") " pod="calico-system/goldmane-666569f655-7qscd"
Dec 16 12:12:53.515400 kubelet[2916]: I1216 12:12:53.515340 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsckl\" (UniqueName: \"kubernetes.io/projected/e4dab5b6-c490-4927-98c0-45033692e43c-kube-api-access-hsckl\") pod \"calico-apiserver-585b8759c9-ggqjc\" (UID: \"e4dab5b6-c490-4927-98c0-45033692e43c\") " pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc"
Dec 16 12:12:53.515539 kubelet[2916]: I1216 12:12:53.515354 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-ca-bundle\") pod \"whisker-6c69d4f66c-hblck\" (UID: \"908b1695-a850-4afd-b4ae-bc3924c46a24\") " pod="calico-system/whisker-6c69d4f66c-hblck"
Dec 16 12:12:53.704280 containerd[1662]: time="2025-12-16T12:12:53.704226922Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jp97r,Uid:78a6a3e9-dded-4049-98d3-1cf9591828bf,Namespace:kube-system,Attempt:0,}"
Dec 16 12:12:53.718001 containerd[1662]: time="2025-12-16T12:12:53.717967121Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tdcxp,Uid:7883990b-d1d6-4772-8df2-71ee5dd1032b,Namespace:kube-system,Attempt:0,}"
Dec 16 12:12:53.727402 containerd[1662]: time="2025-12-16T12:12:53.727225466Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8494b5b4-8w8m7,Uid:9aa83703-a98a-4b61-a3ff-9d2f2ed50abd,Namespace:calico-system,Attempt:0,}"
Dec 16 12:12:53.733050 containerd[1662]: time="2025-12-16T12:12:53.732999082Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-djtxq,Uid:fd2217b9-d056-43fe-a117-755ac4fcec22,Namespace:calico-apiserver,Attempt:0,}"
Dec 16 12:12:53.788855 containerd[1662]: time="2025-12-16T12:12:53.788796038Z" level=error msg="Failed to destroy network for sandbox \"88be98bac221e0134f2ab567f2f63ddd7fbfed0f1b105defacfc4e4470528487\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.791692 containerd[1662]: time="2025-12-16T12:12:53.791600206Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tdcxp,Uid:7883990b-d1d6-4772-8df2-71ee5dd1032b,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"88be98bac221e0134f2ab567f2f63ddd7fbfed0f1b105defacfc4e4470528487\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.791929 kubelet[2916]: E1216 12:12:53.791885 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88be98bac221e0134f2ab567f2f63ddd7fbfed0f1b105defacfc4e4470528487\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.791977 kubelet[2916]: E1216 12:12:53.791960 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88be98bac221e0134f2ab567f2f63ddd7fbfed0f1b105defacfc4e4470528487\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tdcxp"
Dec 16 12:12:53.792002 kubelet[2916]: E1216 12:12:53.791980 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"88be98bac221e0134f2ab567f2f63ddd7fbfed0f1b105defacfc4e4470528487\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-tdcxp"
Dec 16 12:12:53.792060 kubelet[2916]: E1216 12:12:53.792036 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-tdcxp_kube-system(7883990b-d1d6-4772-8df2-71ee5dd1032b)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-tdcxp_kube-system(7883990b-d1d6-4772-8df2-71ee5dd1032b)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"88be98bac221e0134f2ab567f2f63ddd7fbfed0f1b105defacfc4e4470528487\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-tdcxp" podUID="7883990b-d1d6-4772-8df2-71ee5dd1032b"
Dec 16 12:12:53.792655 containerd[1662]: time="2025-12-16T12:12:53.792574768Z" level=error msg="Failed to destroy network for sandbox \"56b62697aa8509c96105db5b461fc1a6c9678015ddd65af0e733c5bf208e5d71\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.795052 containerd[1662]: time="2025-12-16T12:12:53.795010895Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jp97r,Uid:78a6a3e9-dded-4049-98d3-1cf9591828bf,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"56b62697aa8509c96105db5b461fc1a6c9678015ddd65af0e733c5bf208e5d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.795303 kubelet[2916]: E1216 12:12:53.795232 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56b62697aa8509c96105db5b461fc1a6c9678015ddd65af0e733c5bf208e5d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.795359 kubelet[2916]: E1216 12:12:53.795317 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56b62697aa8509c96105db5b461fc1a6c9678015ddd65af0e733c5bf208e5d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-jp97r"
Dec 16 12:12:53.795359 kubelet[2916]: E1216 12:12:53.795345 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"56b62697aa8509c96105db5b461fc1a6c9678015ddd65af0e733c5bf208e5d71\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-674b8bbfcf-jp97r"
Dec 16 12:12:53.795451 kubelet[2916]: E1216 12:12:53.795396 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-674b8bbfcf-jp97r_kube-system(78a6a3e9-dded-4049-98d3-1cf9591828bf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-674b8bbfcf-jp97r_kube-system(78a6a3e9-dded-4049-98d3-1cf9591828bf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"56b62697aa8509c96105db5b461fc1a6c9678015ddd65af0e733c5bf208e5d71\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-674b8bbfcf-jp97r" podUID="78a6a3e9-dded-4049-98d3-1cf9591828bf"
Dec 16 12:12:53.802549 containerd[1662]: time="2025-12-16T12:12:53.802488476Z" level=error msg="Failed to destroy network for sandbox \"98a9c8c492bef80999f88739291fa00c4ccf82ad4f1aaaadb6c878d51a704a96\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.804882 containerd[1662]: time="2025-12-16T12:12:53.804783762Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8494b5b4-8w8m7,Uid:9aa83703-a98a-4b61-a3ff-9d2f2ed50abd,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"98a9c8c492bef80999f88739291fa00c4ccf82ad4f1aaaadb6c878d51a704a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.805069 kubelet[2916]: E1216 12:12:53.805005 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98a9c8c492bef80999f88739291fa00c4ccf82ad4f1aaaadb6c878d51a704a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.805121 kubelet[2916]: E1216 12:12:53.805080 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98a9c8c492bef80999f88739291fa00c4ccf82ad4f1aaaadb6c878d51a704a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7"
Dec 16 12:12:53.805121 kubelet[2916]: E1216 12:12:53.805099 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"98a9c8c492bef80999f88739291fa00c4ccf82ad4f1aaaadb6c878d51a704a96\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7"
Dec 16 12:12:53.805170 kubelet[2916]: E1216 12:12:53.805151 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"98a9c8c492bef80999f88739291fa00c4ccf82ad4f1aaaadb6c878d51a704a96\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd"
Dec 16 12:12:53.809529 containerd[1662]: time="2025-12-16T12:12:53.809493015Z" level=error msg="Failed to destroy network for sandbox \"3d3f0708309aba6556170e734c3c494e949f5a0db0c3f338e2b2281a1bac2a5a\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.811673 containerd[1662]: time="2025-12-16T12:12:53.811642941Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-djtxq,Uid:fd2217b9-d056-43fe-a117-755ac4fcec22,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d3f0708309aba6556170e734c3c494e949f5a0db0c3f338e2b2281a1bac2a5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.811900 kubelet[2916]: E1216 12:12:53.811844 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d3f0708309aba6556170e734c3c494e949f5a0db0c3f338e2b2281a1bac2a5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:53.811945 kubelet[2916]: E1216 12:12:53.811912 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d3f0708309aba6556170e734c3c494e949f5a0db0c3f338e2b2281a1bac2a5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq"
Dec 16 12:12:53.811945 kubelet[2916]: E1216 12:12:53.811929 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"3d3f0708309aba6556170e734c3c494e949f5a0db0c3f338e2b2281a1bac2a5a\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq"
Dec 16 12:12:53.812007 kubelet[2916]: E1216 12:12:53.811981 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"3d3f0708309aba6556170e734c3c494e949f5a0db0c3f338e2b2281a1bac2a5a\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22"
Dec 16 12:12:54.038479 containerd[1662]: time="2025-12-16T12:12:54.038343773Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c69d4f66c-hblck,Uid:908b1695-a850-4afd-b4ae-bc3924c46a24,Namespace:calico-system,Attempt:0,}"
Dec 16 12:12:54.045061 containerd[1662]: time="2025-12-16T12:12:54.045026151Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-ggqjc,Uid:e4dab5b6-c490-4927-98c0-45033692e43c,Namespace:calico-apiserver,Attempt:0,}"
Dec 16 12:12:54.049111 containerd[1662]: time="2025-12-16T12:12:54.049073483Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-7qscd,Uid:a0cf5bad-dc9b-4b29-b409-00ae25b08c7c,Namespace:calico-system,Attempt:0,}"
Dec 16 12:12:54.070023 systemd[1]: Created slice kubepods-besteffort-pod15e1938f_0bd9_43f1_a8b8_ecedb5c4b1c4.slice - libcontainer container kubepods-besteffort-pod15e1938f_0bd9_43f1_a8b8_ecedb5c4b1c4.slice.
Dec 16 12:12:54.073223 containerd[1662]: time="2025-12-16T12:12:54.073192150Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cfgbh,Uid:15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4,Namespace:calico-system,Attempt:0,}"
Dec 16 12:12:54.094036 containerd[1662]: time="2025-12-16T12:12:54.093822967Z" level=error msg="Failed to destroy network for sandbox \"2f56b7ec676c6d941f89a2a1f34fe0f03dc417744d621f9a4bbe38168ab8e161\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:54.097792 containerd[1662]: time="2025-12-16T12:12:54.097725058Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-6c69d4f66c-hblck,Uid:908b1695-a850-4afd-b4ae-bc3924c46a24,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f56b7ec676c6d941f89a2a1f34fe0f03dc417744d621f9a4bbe38168ab8e161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:54.099070 kubelet[2916]: E1216 12:12:54.098720 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f56b7ec676c6d941f89a2a1f34fe0f03dc417744d621f9a4bbe38168ab8e161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/"
Dec 16 12:12:54.099070 kubelet[2916]: E1216 12:12:54.098937 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f56b7ec676c6d941f89a2a1f34fe0f03dc417744d621f9a4bbe38168ab8e161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c69d4f66c-hblck"
Dec 16 12:12:54.099070 kubelet[2916]: E1216 12:12:54.098962 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2f56b7ec676c6d941f89a2a1f34fe0f03dc417744d621f9a4bbe38168ab8e161\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-6c69d4f66c-hblck"
Dec 16 12:12:54.099463 kubelet[2916]: E1216 12:12:54.099302 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-6c69d4f66c-hblck_calico-system(908b1695-a850-4afd-b4ae-bc3924c46a24)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-6c69d4f66c-hblck_calico-system(908b1695-a850-4afd-b4ae-bc3924c46a24)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2f56b7ec676c6d941f89a2a1f34fe0f03dc417744d621f9a4bbe38168ab8e161\\\":
plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-6c69d4f66c-hblck" podUID="908b1695-a850-4afd-b4ae-bc3924c46a24" Dec 16 12:12:54.117739 containerd[1662]: time="2025-12-16T12:12:54.117218952Z" level=error msg="Failed to destroy network for sandbox \"d408f1c69fcd0e52a0010d58548774088b2f45300e905c45b970b87c5bf1dda8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.121953 containerd[1662]: time="2025-12-16T12:12:54.121775245Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-ggqjc,Uid:e4dab5b6-c490-4927-98c0-45033692e43c,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d408f1c69fcd0e52a0010d58548774088b2f45300e905c45b970b87c5bf1dda8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.122569 kubelet[2916]: E1216 12:12:54.122529 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d408f1c69fcd0e52a0010d58548774088b2f45300e905c45b970b87c5bf1dda8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.123566 kubelet[2916]: E1216 12:12:54.122700 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d408f1c69fcd0e52a0010d58548774088b2f45300e905c45b970b87c5bf1dda8\": plugin type=\"calico\" failed 
(add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" Dec 16 12:12:54.123566 kubelet[2916]: E1216 12:12:54.122730 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d408f1c69fcd0e52a0010d58548774088b2f45300e905c45b970b87c5bf1dda8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" Dec 16 12:12:54.123566 kubelet[2916]: E1216 12:12:54.122781 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d408f1c69fcd0e52a0010d58548774088b2f45300e905c45b970b87c5bf1dda8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:12:54.123873 containerd[1662]: time="2025-12-16T12:12:54.123840451Z" level=error msg="Failed to destroy network for sandbox \"92243ae6158462239a52344dcc719fdba043921ed9470a0b1aba9231e1c3b124\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.128807 containerd[1662]: 
time="2025-12-16T12:12:54.128731625Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-7qscd,Uid:a0cf5bad-dc9b-4b29-b409-00ae25b08c7c,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"92243ae6158462239a52344dcc719fdba043921ed9470a0b1aba9231e1c3b124\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.129109 kubelet[2916]: E1216 12:12:54.129077 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92243ae6158462239a52344dcc719fdba043921ed9470a0b1aba9231e1c3b124\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.129208 kubelet[2916]: E1216 12:12:54.129193 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92243ae6158462239a52344dcc719fdba043921ed9470a0b1aba9231e1c3b124\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-7qscd" Dec 16 12:12:54.130096 kubelet[2916]: E1216 12:12:54.129263 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"92243ae6158462239a52344dcc719fdba043921ed9470a0b1aba9231e1c3b124\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-666569f655-7qscd" Dec 16 12:12:54.130096 
kubelet[2916]: E1216 12:12:54.129320 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"92243ae6158462239a52344dcc719fdba043921ed9470a0b1aba9231e1c3b124\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:12:54.139937 containerd[1662]: time="2025-12-16T12:12:54.139898176Z" level=error msg="Failed to destroy network for sandbox \"9b6ae07d91ee3e714093242627bdee1804a914972c9b23a932c94e68de410e1c\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.142302 containerd[1662]: time="2025-12-16T12:12:54.142271822Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cfgbh,Uid:15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6ae07d91ee3e714093242627bdee1804a914972c9b23a932c94e68de410e1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.142677 kubelet[2916]: E1216 12:12:54.142505 2916 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox 
\"9b6ae07d91ee3e714093242627bdee1804a914972c9b23a932c94e68de410e1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:12:54.142677 kubelet[2916]: E1216 12:12:54.142559 2916 kuberuntime_sandbox.go:70] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6ae07d91ee3e714093242627bdee1804a914972c9b23a932c94e68de410e1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:54.142677 kubelet[2916]: E1216 12:12:54.142577 2916 kuberuntime_manager.go:1252] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9b6ae07d91ee3e714093242627bdee1804a914972c9b23a932c94e68de410e1c\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-cfgbh" Dec 16 12:12:54.142797 kubelet[2916]: E1216 12:12:54.142621 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9b6ae07d91ee3e714093242627bdee1804a914972c9b23a932c94e68de410e1c\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-cfgbh" 
podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:12:54.181502 containerd[1662]: time="2025-12-16T12:12:54.181215411Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:12:54.591372 systemd[1]: run-netns-cni\x2d5006545e\x2d7958\x2dca98\x2d009a\x2d56e927b32295.mount: Deactivated successfully. Dec 16 12:12:54.591457 systemd[1]: run-netns-cni\x2d236b7f06\x2d0194\x2dd5c8\x2d8794\x2d23bd46cbd8b9.mount: Deactivated successfully. Dec 16 12:13:00.141743 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3094842750.mount: Deactivated successfully. Dec 16 12:13:00.166909 containerd[1662]: time="2025-12-16T12:13:00.166834480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:13:00.167899 containerd[1662]: time="2025-12-16T12:13:00.167835483Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Dec 16 12:13:00.169083 containerd[1662]: time="2025-12-16T12:13:00.169041926Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:13:00.171451 containerd[1662]: time="2025-12-16T12:13:00.171391693Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:13:00.171841 containerd[1662]: time="2025-12-16T12:13:00.171797574Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 5.990543443s" Dec 16 12:13:00.171841 
containerd[1662]: time="2025-12-16T12:13:00.171840014Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 12:13:00.185928 containerd[1662]: time="2025-12-16T12:13:00.185887813Z" level=info msg="CreateContainer within sandbox \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:13:00.198499 containerd[1662]: time="2025-12-16T12:13:00.197862367Z" level=info msg="Container 1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:13:00.207007 containerd[1662]: time="2025-12-16T12:13:00.206932592Z" level=info msg="CreateContainer within sandbox \"23101e0844872f00e9017a5c42b84400c103cd2ac51869df9e22ed63a084429f\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2\"" Dec 16 12:13:00.207914 containerd[1662]: time="2025-12-16T12:13:00.207879714Z" level=info msg="StartContainer for \"1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2\"" Dec 16 12:13:00.209474 containerd[1662]: time="2025-12-16T12:13:00.209448359Z" level=info msg="connecting to shim 1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2" address="unix:///run/containerd/s/511a2f4a7b69e5631dafd6f38394ed92a975aae1789a368dee9c5c5b455a0e0e" protocol=ttrpc version=3 Dec 16 12:13:00.229018 systemd[1]: Started cri-containerd-1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2.scope - libcontainer container 1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2. 
Dec 16 12:13:00.302000 audit: BPF prog-id=172 op=LOAD Dec 16 12:13:00.307733 kernel: kauditd_printk_skb: 6 callbacks suppressed Dec 16 12:13:00.307801 kernel: audit: type=1334 audit(1765887180.302:575): prog-id=172 op=LOAD Dec 16 12:13:00.307837 kernel: audit: type=1300 audit(1765887180.302:575): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001923e8 a2=98 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.307861 kernel: audit: type=1327 audit(1765887180.302:575): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.302000 audit[4046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001923e8 a2=98 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.302000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.303000 audit: BPF prog-id=173 op=LOAD Dec 16 12:13:00.311783 kernel: audit: type=1334 audit(1765887180.303:576): prog-id=173 op=LOAD Dec 16 12:13:00.311853 kernel: audit: type=1300 audit(1765887180.303:576): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000192168 a2=98 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.303000 audit[4046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000192168 a2=98 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.318442 kernel: audit: type=1327 audit(1765887180.303:576): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.318514 kernel: audit: type=1334 audit(1765887180.303:577): prog-id=173 op=UNLOAD Dec 16 12:13:00.303000 audit: BPF prog-id=173 op=UNLOAD Dec 16 12:13:00.303000 audit[4046]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.322678 kernel: audit: type=1300 audit(1765887180.303:577): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.322803 kernel: audit: type=1327 audit(1765887180.303:577): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.303000 audit: BPF prog-id=172 op=UNLOAD Dec 16 12:13:00.326854 kernel: audit: type=1334 audit(1765887180.303:578): prog-id=172 op=UNLOAD Dec 16 12:13:00.303000 audit[4046]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.303000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.303000 audit: BPF prog-id=174 op=LOAD Dec 16 12:13:00.303000 audit[4046]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000192648 a2=98 a3=0 items=0 ppid=3478 pid=4046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:00.303000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3165613161373336393039616637326133323239363035383663336236 Dec 16 12:13:00.343916 containerd[1662]: time="2025-12-16T12:13:00.343865093Z" level=info msg="StartContainer for \"1ea1a736909af72a322960586c3b6d6a6a5c42ca4c68193cafe67428cdb16ca2\" returns successfully" Dec 16 12:13:00.487703 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:13:00.487914 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:13:00.662036 kubelet[2916]: I1216 12:13:00.661979 2916 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-ca-bundle\") pod \"908b1695-a850-4afd-b4ae-bc3924c46a24\" (UID: \"908b1695-a850-4afd-b4ae-bc3924c46a24\") " Dec 16 12:13:00.662036 kubelet[2916]: I1216 12:13:00.662039 2916 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rjg7b\" (UniqueName: \"kubernetes.io/projected/908b1695-a850-4afd-b4ae-bc3924c46a24-kube-api-access-rjg7b\") pod \"908b1695-a850-4afd-b4ae-bc3924c46a24\" (UID: \"908b1695-a850-4afd-b4ae-bc3924c46a24\") " Dec 16 12:13:00.662396 kubelet[2916]: I1216 12:13:00.662080 2916 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-backend-key-pair\") pod \"908b1695-a850-4afd-b4ae-bc3924c46a24\" (UID: \"908b1695-a850-4afd-b4ae-bc3924c46a24\") " Dec 16 12:13:00.663171 kubelet[2916]: I1216 12:13:00.662748 2916 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-ca-bundle" 
(OuterVolumeSpecName: "whisker-ca-bundle") pod "908b1695-a850-4afd-b4ae-bc3924c46a24" (UID: "908b1695-a850-4afd-b4ae-bc3924c46a24"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:13:00.665296 kubelet[2916]: I1216 12:13:00.665242 2916 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "908b1695-a850-4afd-b4ae-bc3924c46a24" (UID: "908b1695-a850-4afd-b4ae-bc3924c46a24"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:13:00.666497 kubelet[2916]: I1216 12:13:00.666355 2916 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/908b1695-a850-4afd-b4ae-bc3924c46a24-kube-api-access-rjg7b" (OuterVolumeSpecName: "kube-api-access-rjg7b") pod "908b1695-a850-4afd-b4ae-bc3924c46a24" (UID: "908b1695-a850-4afd-b4ae-bc3924c46a24"). InnerVolumeSpecName "kube-api-access-rjg7b". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:13:00.763000 kubelet[2916]: I1216 12:13:00.762925 2916 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-backend-key-pair\") on node \"ci-4547-0-0-4-c6e23b3406\" DevicePath \"\"" Dec 16 12:13:00.763000 kubelet[2916]: I1216 12:13:00.762961 2916 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/908b1695-a850-4afd-b4ae-bc3924c46a24-whisker-ca-bundle\") on node \"ci-4547-0-0-4-c6e23b3406\" DevicePath \"\"" Dec 16 12:13:00.763000 kubelet[2916]: I1216 12:13:00.762973 2916 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rjg7b\" (UniqueName: \"kubernetes.io/projected/908b1695-a850-4afd-b4ae-bc3924c46a24-kube-api-access-rjg7b\") on node \"ci-4547-0-0-4-c6e23b3406\" DevicePath \"\"" Dec 16 12:13:01.070575 systemd[1]: Removed slice kubepods-besteffort-pod908b1695_a850_4afd_b4ae_bc3924c46a24.slice - libcontainer container kubepods-besteffort-pod908b1695_a850_4afd_b4ae_bc3924c46a24.slice. Dec 16 12:13:01.141249 systemd[1]: var-lib-kubelet-pods-908b1695\x2da850\x2d4afd\x2db4ae\x2dbc3924c46a24-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2drjg7b.mount: Deactivated successfully. Dec 16 12:13:01.141345 systemd[1]: var-lib-kubelet-pods-908b1695\x2da850\x2d4afd\x2db4ae\x2dbc3924c46a24-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Dec 16 12:13:01.502944 kubelet[2916]: I1216 12:13:01.502752 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wcj42" podStartSLOduration=3.012767782 podStartE2EDuration="18.50266492s" podCreationTimestamp="2025-12-16 12:12:43 +0000 UTC" firstStartedPulling="2025-12-16 12:12:44.682794838 +0000 UTC m=+27.695106761" lastFinishedPulling="2025-12-16 12:13:00.172692016 +0000 UTC m=+43.185003899" observedRunningTime="2025-12-16 12:13:01.497035705 +0000 UTC m=+44.509347628" watchObservedRunningTime="2025-12-16 12:13:01.50266492 +0000 UTC m=+44.514976843" Dec 16 12:13:01.517298 systemd[1]: Created slice kubepods-besteffort-pod9d7a60ea_0d84_4a1a_8fbd_83649fe49cfb.slice - libcontainer container kubepods-besteffort-pod9d7a60ea_0d84_4a1a_8fbd_83649fe49cfb.slice. Dec 16 12:13:01.571534 kubelet[2916]: I1216 12:13:01.571425 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb-whisker-backend-key-pair\") pod \"whisker-5996544d9f-ldq7h\" (UID: \"9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb\") " pod="calico-system/whisker-5996544d9f-ldq7h" Dec 16 12:13:01.571739 kubelet[2916]: I1216 12:13:01.571547 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb-whisker-ca-bundle\") pod \"whisker-5996544d9f-ldq7h\" (UID: \"9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb\") " pod="calico-system/whisker-5996544d9f-ldq7h" Dec 16 12:13:01.571739 kubelet[2916]: I1216 12:13:01.571687 2916 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9842c\" (UniqueName: \"kubernetes.io/projected/9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb-kube-api-access-9842c\") pod \"whisker-5996544d9f-ldq7h\" (UID: 
\"9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb\") " pod="calico-system/whisker-5996544d9f-ldq7h" Dec 16 12:13:01.823131 containerd[1662]: time="2025-12-16T12:13:01.823029652Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5996544d9f-ldq7h,Uid:9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb,Namespace:calico-system,Attempt:0,}" Dec 16 12:13:01.982337 systemd-networkd[1584]: calia6ab1f283a6: Link UP Dec 16 12:13:01.982531 systemd-networkd[1584]: calia6ab1f283a6: Gained carrier Dec 16 12:13:01.995962 containerd[1662]: 2025-12-16 12:13:01.853 [INFO][4207] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Dec 16 12:13:01.995962 containerd[1662]: 2025-12-16 12:13:01.876 [INFO][4207] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0 whisker-5996544d9f- calico-system 9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb 888 0 2025-12-16 12:13:01 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5996544d9f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 whisker-5996544d9f-ldq7h eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] calia6ab1f283a6 [] [] }} ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-" Dec 16 12:13:01.995962 containerd[1662]: 2025-12-16 12:13:01.876 [INFO][4207] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:01.995962 containerd[1662]: 2025-12-16 12:13:01.933 [INFO][4227] 
ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" HandleID="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Workload="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.933 [INFO][4227] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" HandleID="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Workload="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002c3600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"whisker-5996544d9f-ldq7h", "timestamp":"2025-12-16 12:13:01.933827921 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.934 [INFO][4227] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.934 [INFO][4227] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.934 [INFO][4227] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.944 [INFO][4227] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.949 [INFO][4227] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.953 [INFO][4227] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.955 [INFO][4227] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996164 containerd[1662]: 2025-12-16 12:13:01.960 [INFO][4227] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.960 [INFO][4227] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.961 [INFO][4227] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.965 [INFO][4227] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.971 [INFO][4227] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.63.193/26] block=192.168.63.192/26 handle="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.971 [INFO][4227] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.193/26] handle="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.971 [INFO][4227] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:13:01.996394 containerd[1662]: 2025-12-16 12:13:01.972 [INFO][4227] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.193/26] IPv6=[] ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" HandleID="k8s-pod-network.0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Workload="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:01.996538 containerd[1662]: 2025-12-16 12:13:01.974 [INFO][4207] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0", GenerateName:"whisker-5996544d9f-", Namespace:"calico-system", SelfLink:"", UID:"9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb", ResourceVersion:"888", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 13, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5996544d9f", 
"projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"whisker-5996544d9f-ldq7h", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.63.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia6ab1f283a6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:01.996538 containerd[1662]: 2025-12-16 12:13:01.974 [INFO][4207] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.193/32] ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:01.996612 containerd[1662]: 2025-12-16 12:13:01.974 [INFO][4207] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia6ab1f283a6 ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:01.996612 containerd[1662]: 2025-12-16 12:13:01.983 [INFO][4207] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:01.996675 containerd[1662]: 2025-12-16 12:13:01.983 [INFO][4207] 
cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0", GenerateName:"whisker-5996544d9f-", Namespace:"calico-system", SelfLink:"", UID:"9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb", ResourceVersion:"888", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 13, 1, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5996544d9f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f", Pod:"whisker-5996544d9f-ldq7h", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.63.193/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"calia6ab1f283a6", MAC:"32:ac:88:e6:27:c4", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:01.996728 containerd[1662]: 2025-12-16 12:13:01.993 [INFO][4207] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore 
ContainerID="0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" Namespace="calico-system" Pod="whisker-5996544d9f-ldq7h" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-whisker--5996544d9f--ldq7h-eth0" Dec 16 12:13:02.016343 containerd[1662]: time="2025-12-16T12:13:02.016295191Z" level=info msg="connecting to shim 0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f" address="unix:///run/containerd/s/730e46a7d553c3c5f2a75cf2f9cf610de42b270cbb4523245e690f17ff2560c2" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:02.037825 systemd[1]: Started cri-containerd-0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f.scope - libcontainer container 0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f. Dec 16 12:13:02.047000 audit: BPF prog-id=175 op=LOAD Dec 16 12:13:02.047000 audit: BPF prog-id=176 op=LOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.047000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.047000 audit: BPF prog-id=177 op=LOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.047000 audit: BPF prog-id=178 op=LOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.047000 audit: BPF prog-id=178 op=UNLOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.047000 audit: BPF prog-id=177 op=UNLOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.047000 audit: BPF prog-id=179 op=LOAD Dec 16 12:13:02.047000 audit[4262]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4251 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.047000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3031383361353935313137346636396236646230376430366439626630 Dec 16 12:13:02.070870 containerd[1662]: time="2025-12-16T12:13:02.070815663Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5996544d9f-ldq7h,Uid:9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb,Namespace:calico-system,Attempt:0,} returns sandbox id 
\"0183a5951174f69b6db07d06d9bf0a5ba0ec0b6a3e3090ea4403232b2487c63f\"" Dec 16 12:13:02.074468 containerd[1662]: time="2025-12-16T12:13:02.073595030Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:13:02.161323 kubelet[2916]: I1216 12:13:02.161216 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:13:02.184000 audit[4289]: NETFILTER_CFG table=filter:123 family=2 entries=21 op=nft_register_rule pid=4289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:02.184000 audit[4289]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe27d8b50 a2=0 a3=1 items=0 ppid=3060 pid=4289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.184000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:02.189000 audit[4289]: NETFILTER_CFG table=nat:124 family=2 entries=19 op=nft_register_chain pid=4289 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:02.189000 audit[4289]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffe27d8b50 a2=0 a3=1 items=0 ppid=3060 pid=4289 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:02.189000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:02.199874 kubelet[2916]: I1216 12:13:02.199804 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:13:02.439495 containerd[1662]: time="2025-12-16T12:13:02.439365809Z" level=info msg="fetch failed after status: 404 Not Found" 
host=ghcr.io Dec 16 12:13:02.443703 containerd[1662]: time="2025-12-16T12:13:02.443664741Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:13:02.443799 containerd[1662]: time="2025-12-16T12:13:02.443734981Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:02.444229 kubelet[2916]: E1216 12:13:02.443984 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:13:02.444229 kubelet[2916]: E1216 12:13:02.444040 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:13:02.444326 kubelet[2916]: E1216 12:13:02.444178 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:62693b2085a94b8ea373cfdd88d47738,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:02.446293 containerd[1662]: time="2025-12-16T12:13:02.446263868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:13:02.776580 containerd[1662]: 
time="2025-12-16T12:13:02.776447148Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:02.778200 containerd[1662]: time="2025-12-16T12:13:02.778156352Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:13:02.778292 containerd[1662]: time="2025-12-16T12:13:02.778181712Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:02.778419 kubelet[2916]: E1216 12:13:02.778379 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:13:02.778462 kubelet[2916]: E1216 12:13:02.778428 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:13:02.778582 kubelet[2916]: E1216 12:13:02.778530 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:02.779964 kubelet[2916]: E1216 12:13:02.779915 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:13:03.036000 audit: BPF prog-id=180 op=LOAD Dec 16 12:13:03.036000 audit[4333]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcca96758 a2=98 a3=ffffcca96748 items=0 ppid=4292 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.036000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:13:03.036000 audit: BPF prog-id=180 op=UNLOAD Dec 16 12:13:03.036000 audit[4333]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffcca96728 a3=0 items=0 ppid=4292 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:13:03.036000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:13:03.036000 audit: BPF prog-id=181 op=LOAD Dec 16 12:13:03.036000 audit[4333]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcca96608 a2=74 a3=95 items=0 ppid=4292 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.036000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:13:03.036000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:13:03.036000 audit[4333]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4292 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.036000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:13:03.036000 audit: BPF prog-id=182 op=LOAD Dec 16 12:13:03.036000 audit[4333]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcca96638 a2=40 a3=ffffcca96668 items=0 ppid=4292 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.036000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:13:03.036000 audit: BPF prog-id=182 op=UNLOAD Dec 16 12:13:03.036000 audit[4333]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffcca96668 items=0 ppid=4292 pid=4333 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.036000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:13:03.038000 audit: BPF prog-id=183 op=LOAD Dec 16 12:13:03.038000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc789c238 a2=98 a3=ffffc789c228 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.038000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.038000 audit: BPF prog-id=183 op=UNLOAD Dec 16 12:13:03.038000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc789c208 a3=0 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.038000 audit: 
PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.038000 audit: BPF prog-id=184 op=LOAD Dec 16 12:13:03.038000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc789bec8 a2=74 a3=95 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.038000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.038000 audit: BPF prog-id=184 op=UNLOAD Dec 16 12:13:03.038000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.038000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.038000 audit: BPF prog-id=185 op=LOAD Dec 16 12:13:03.038000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc789bf28 a2=94 a3=2 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.038000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.038000 audit: BPF prog-id=185 op=UNLOAD Dec 16 12:13:03.038000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.038000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 
12:13:03.066766 kubelet[2916]: I1216 12:13:03.066711 2916 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="908b1695-a850-4afd-b4ae-bc3924c46a24" path="/var/lib/kubelet/pods/908b1695-a850-4afd-b4ae-bc3924c46a24/volumes" Dec 16 12:13:03.143000 audit: BPF prog-id=186 op=LOAD Dec 16 12:13:03.143000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffc789bee8 a2=40 a3=ffffc789bf18 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.143000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:13:03.143000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffc789bf18 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.143000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.152000 audit: BPF prog-id=187 op=LOAD Dec 16 12:13:03.152000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc789bef8 a2=94 a3=4 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.152000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.152000 audit: BPF prog-id=187 op=UNLOAD Dec 16 12:13:03.152000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.152000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.152000 audit: BPF prog-id=188 op=LOAD Dec 16 12:13:03.152000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc789bd38 a2=94 a3=5 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.152000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.152000 audit: BPF prog-id=188 op=UNLOAD Dec 16 12:13:03.152000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.152000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.152000 audit: BPF prog-id=189 op=LOAD Dec 16 12:13:03.152000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc789bf68 a2=94 a3=6 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.152000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.152000 audit: BPF prog-id=189 op=UNLOAD Dec 16 12:13:03.152000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) 
Dec 16 12:13:03.152000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.153000 audit: BPF prog-id=190 op=LOAD Dec 16 12:13:03.153000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffc789b738 a2=94 a3=83 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.153000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.153000 audit: BPF prog-id=191 op=LOAD Dec 16 12:13:03.153000 audit[4335]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffc789b4f8 a2=94 a3=2 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.153000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.153000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:13:03.153000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.153000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.153000 audit: BPF prog-id=190 op=UNLOAD Dec 16 12:13:03.153000 audit[4335]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1eb90620 a3=1eb83b00 items=0 ppid=4292 pid=4335 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.153000 audit: PROCTITLE 
proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:13:03.162000 audit: BPF prog-id=192 op=LOAD Dec 16 12:13:03.162000 audit[4353]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd9812b8 a2=98 a3=ffffdd9812a8 items=0 ppid=4292 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.162000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:13:03.163000 audit: BPF prog-id=192 op=UNLOAD Dec 16 12:13:03.163000 audit[4353]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdd981288 a3=0 items=0 ppid=4292 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.163000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:13:03.163000 audit: BPF prog-id=193 op=LOAD Dec 16 12:13:03.163000 audit[4353]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd981168 a2=74 a3=95 items=0 ppid=4292 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.163000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:13:03.163000 audit: BPF prog-id=193 op=UNLOAD Dec 16 12:13:03.163000 audit[4353]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4292 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.163000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:13:03.163000 audit: BPF prog-id=194 op=LOAD Dec 16 12:13:03.163000 audit[4353]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd981198 a2=40 a3=ffffdd9811c8 items=0 ppid=4292 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.163000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:13:03.163000 audit: BPF prog-id=194 op=UNLOAD Dec 16 12:13:03.163000 audit[4353]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffdd9811c8 items=0 ppid=4292 pid=4353 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.163000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:13:03.206410 kubelet[2916]: E1216 12:13:03.206182 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:13:03.226000 audit[4369]: NETFILTER_CFG table=filter:125 family=2 entries=20 op=nft_register_rule pid=4369 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:03.226000 audit[4369]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffbff5790 a2=0 a3=1 items=0 ppid=3060 pid=4369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.226000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:03.230756 
systemd-networkd[1584]: vxlan.calico: Link UP Dec 16 12:13:03.230767 systemd-networkd[1584]: vxlan.calico: Gained carrier Dec 16 12:13:03.236000 audit[4369]: NETFILTER_CFG table=nat:126 family=2 entries=14 op=nft_register_rule pid=4369 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:03.236000 audit[4369]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=fffffbff5790 a2=0 a3=1 items=0 ppid=3060 pid=4369 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.236000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:03.251000 audit: BPF prog-id=195 op=LOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffecb40018 a2=98 a3=ffffecb40008 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=195 op=UNLOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffecb3ffe8 a3=0 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=196 op=LOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffecb3fcf8 a2=74 a3=95 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=197 op=LOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffecb3fd58 a2=94 a3=2 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=197 op=UNLOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=198 op=LOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffecb3fbd8 a2=40 a3=ffffecb3fc08 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=198 op=UNLOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffecb3fc08 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=199 op=LOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffecb3fd28 a2=94 a3=b7 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.251000 audit: BPF prog-id=199 op=UNLOAD Dec 16 12:13:03.251000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.251000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.252000 audit: BPF prog-id=200 op=LOAD Dec 16 12:13:03.252000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffecb3f3d8 a2=94 a3=2 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.252000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.252000 audit: BPF prog-id=200 op=UNLOAD Dec 16 12:13:03.252000 audit[4380]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.252000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.252000 audit: BPF prog-id=201 op=LOAD Dec 16 12:13:03.252000 audit[4380]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffecb3f568 a2=94 a3=30 items=0 ppid=4292 pid=4380 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.252000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:13:03.261000 audit: BPF prog-id=202 op=LOAD Dec 16 12:13:03.261000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffeb721d38 a2=98 a3=ffffeb721d28 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.261000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.261000 audit: BPF prog-id=202 op=UNLOAD Dec 16 12:13:03.261000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffeb721d08 a3=0 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.261000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.261000 audit: BPF prog-id=203 op=LOAD Dec 16 12:13:03.261000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffeb7219c8 a2=74 a3=95 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.261000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.261000 audit: BPF prog-id=203 op=UNLOAD Dec 16 12:13:03.261000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.261000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.261000 audit: BPF prog-id=204 op=LOAD Dec 16 12:13:03.261000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffeb721a28 a2=94 a3=2 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.261000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.261000 audit: BPF prog-id=204 op=UNLOAD Dec 16 12:13:03.261000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.261000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.288824 systemd-networkd[1584]: calia6ab1f283a6: Gained IPv6LL Dec 16 12:13:03.362000 audit: BPF prog-id=205 op=LOAD Dec 16 12:13:03.362000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffeb7219e8 a2=40 a3=ffffeb721a18 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.362000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.362000 audit: BPF prog-id=205 op=UNLOAD Dec 16 12:13:03.362000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffeb721a18 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.362000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.371000 audit: BPF prog-id=206 op=LOAD Dec 16 12:13:03.371000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffeb7219f8 a2=94 a3=4 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.371000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=206 op=UNLOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=207 op=LOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffeb721838 a2=94 a3=5 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=207 op=UNLOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=208 op=LOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffeb721a68 a2=94 a3=6 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=208 op=UNLOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=209 op=LOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffeb721238 a2=94 a3=83 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=210 op=LOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffeb720ff8 a2=94 a3=2 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.372000 audit: BPF prog-id=210 op=UNLOAD Dec 16 12:13:03.372000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.372000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.373000 audit: BPF prog-id=209 op=UNLOAD Dec 16 12:13:03.373000 audit[4386]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=3b22d620 a3=3b220b00 items=0 ppid=4292 pid=4386 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.373000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:13:03.388000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:13:03.388000 audit[4292]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=400120ebc0 a2=0 a3=0 items=0 ppid=4114 pid=4292 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.388000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:13:03.432000 audit[4413]: NETFILTER_CFG table=nat:127 family=2 entries=15 
op=nft_register_chain pid=4413 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:03.432000 audit[4413]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffeddb1a00 a2=0 a3=ffff8a5d1fa8 items=0 ppid=4292 pid=4413 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.432000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:03.434000 audit[4416]: NETFILTER_CFG table=mangle:128 family=2 entries=16 op=nft_register_chain pid=4416 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:03.434000 audit[4416]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=fffffdaad330 a2=0 a3=ffffa7de2fa8 items=0 ppid=4292 pid=4416 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.434000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:03.450000 audit[4414]: NETFILTER_CFG table=raw:129 family=2 entries=21 op=nft_register_chain pid=4414 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:03.450000 audit[4414]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffe2b7b220 a2=0 a3=ffffab9f4fa8 items=0 ppid=4292 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.450000 audit: PROCTITLE 
proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:03.451000 audit[4415]: NETFILTER_CFG table=filter:130 family=2 entries=94 op=nft_register_chain pid=4415 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:03.451000 audit[4415]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffe48a9df0 a2=0 a3=ffff83024fa8 items=0 ppid=4292 pid=4415 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:03.451000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:04.952770 systemd-networkd[1584]: vxlan.calico: Gained IPv6LL Dec 16 12:13:05.065285 containerd[1662]: time="2025-12-16T12:13:05.065118641Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jp97r,Uid:78a6a3e9-dded-4049-98d3-1cf9591828bf,Namespace:kube-system,Attempt:0,}" Dec 16 12:13:05.065285 containerd[1662]: time="2025-12-16T12:13:05.065200722Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tdcxp,Uid:7883990b-d1d6-4772-8df2-71ee5dd1032b,Namespace:kube-system,Attempt:0,}" Dec 16 12:13:05.210894 systemd-networkd[1584]: cali4bb7a4c22e7: Link UP Dec 16 12:13:05.211104 systemd-networkd[1584]: cali4bb7a4c22e7: Gained carrier Dec 16 12:13:05.223876 containerd[1662]: 2025-12-16 12:13:05.124 [INFO][4437] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0 coredns-674b8bbfcf- kube-system 7883990b-d1d6-4772-8df2-71ee5dd1032b 816 0 2025-12-16 12:12:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf 
projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 coredns-674b8bbfcf-tdcxp eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali4bb7a4c22e7 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-" Dec 16 12:13:05.223876 containerd[1662]: 2025-12-16 12:13:05.124 [INFO][4437] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.223876 containerd[1662]: 2025-12-16 12:13:05.154 [INFO][4458] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" HandleID="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Workload="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.154 [INFO][4458] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" HandleID="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Workload="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002e5590), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"coredns-674b8bbfcf-tdcxp", "timestamp":"2025-12-16 12:13:05.15453181 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, 
IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.154 [INFO][4458] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.154 [INFO][4458] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.154 [INFO][4458] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.167 [INFO][4458] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.172 [INFO][4458] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.180 [INFO][4458] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.182 [INFO][4458] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224328 containerd[1662]: 2025-12-16 12:13:05.185 [INFO][4458] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.185 [INFO][4458] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.189 [INFO][4458] ipam/ipam.go 1780: Creating new handle: 
k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0 Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.193 [INFO][4458] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.200 [INFO][4458] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.194/26] block=192.168.63.192/26 handle="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.200 [INFO][4458] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.194/26] handle="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.200 [INFO][4458] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:13:05.224745 containerd[1662]: 2025-12-16 12:13:05.200 [INFO][4458] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.194/26] IPv6=[] ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" HandleID="k8s-pod-network.31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Workload="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.224875 containerd[1662]: 2025-12-16 12:13:05.203 [INFO][4437] cni-plugin/k8s.go 418: Populated endpoint ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7883990b-d1d6-4772-8df2-71ee5dd1032b", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"coredns-674b8bbfcf-tdcxp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, 
InterfaceName:"cali4bb7a4c22e7", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:05.224875 containerd[1662]: 2025-12-16 12:13:05.204 [INFO][4437] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.194/32] ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.224875 containerd[1662]: 2025-12-16 12:13:05.204 [INFO][4437] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali4bb7a4c22e7 ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.224875 containerd[1662]: 2025-12-16 12:13:05.211 [INFO][4437] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.224875 containerd[1662]: 2025-12-16 12:13:05.212 [INFO][4437] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" 
WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"7883990b-d1d6-4772-8df2-71ee5dd1032b", ResourceVersion:"816", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0", Pod:"coredns-674b8bbfcf-tdcxp", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.194/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali4bb7a4c22e7", MAC:"16:3f:b3:be:98:4a", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:05.224875 
containerd[1662]: 2025-12-16 12:13:05.222 [INFO][4437] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" Namespace="kube-system" Pod="coredns-674b8bbfcf-tdcxp" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--tdcxp-eth0" Dec 16 12:13:05.239000 audit[4484]: NETFILTER_CFG table=filter:131 family=2 entries=42 op=nft_register_chain pid=4484 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:05.239000 audit[4484]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22552 a0=3 a1=ffffdde45250 a2=0 a3=ffff90fa6fa8 items=0 ppid=4292 pid=4484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.239000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:05.251203 containerd[1662]: time="2025-12-16T12:13:05.251163279Z" level=info msg="connecting to shim 31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0" address="unix:///run/containerd/s/ed9d31477082022e111ddb9accbc16c32f1b7955f3e78afef3fcaccd9b63d248" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:05.274505 systemd[1]: Started cri-containerd-31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0.scope - libcontainer container 31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0. 
Dec 16 12:13:05.292000 audit: BPF prog-id=211 op=LOAD Dec 16 12:13:05.293000 audit: BPF prog-id=212 op=LOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.293000 audit: BPF prog-id=212 op=UNLOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.293000 audit: BPF prog-id=213 op=LOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.293000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.293000 audit: BPF prog-id=214 op=LOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.293000 audit: BPF prog-id=214 op=UNLOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.293000 audit: BPF prog-id=213 op=UNLOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:13:05.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.293000 audit: BPF prog-id=215 op=LOAD Dec 16 12:13:05.293000 audit[4504]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4494 pid=4504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.293000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3331653333653036356239303564383334303565393838313836663036 Dec 16 12:13:05.309741 systemd-networkd[1584]: cali7f3c962e3ff: Link UP Dec 16 12:13:05.310473 systemd-networkd[1584]: cali7f3c962e3ff: Gained carrier Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.144 [INFO][4429] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0 coredns-674b8bbfcf- kube-system 78a6a3e9-dded-4049-98d3-1cf9591828bf 813 0 2025-12-16 12:12:24 +0000 UTC map[k8s-app:kube-dns pod-template-hash:674b8bbfcf projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 coredns-674b8bbfcf-jp97r eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali7f3c962e3ff [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 }] [] }} ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" 
Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.145 [INFO][4429] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.179 [INFO][4466] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" HandleID="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Workload="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.179 [INFO][4466] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" HandleID="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Workload="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000137600), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"coredns-674b8bbfcf-jp97r", "timestamp":"2025-12-16 12:13:05.179114199 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.179 [INFO][4466] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.200 [INFO][4466] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.201 [INFO][4466] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.266 [INFO][4466] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.278 [INFO][4466] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.285 [INFO][4466] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.287 [INFO][4466] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.289 [INFO][4466] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.290 [INFO][4466] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.292 [INFO][4466] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.296 [INFO][4466] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.303 [INFO][4466] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.63.195/26] block=192.168.63.192/26 handle="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.303 [INFO][4466] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.195/26] handle="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.304 [INFO][4466] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:13:05.333007 containerd[1662]: 2025-12-16 12:13:05.304 [INFO][4466] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.195/26] IPv6=[] ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" HandleID="k8s-pod-network.0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Workload="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.333893 containerd[1662]: 2025-12-16 12:13:05.306 [INFO][4429] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"78a6a3e9-dded-4049-98d3-1cf9591828bf", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", 
"projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"coredns-674b8bbfcf-jp97r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7f3c962e3ff", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:05.333893 containerd[1662]: 2025-12-16 12:13:05.306 [INFO][4429] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.195/32] ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.333893 containerd[1662]: 2025-12-16 12:13:05.306 [INFO][4429] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7f3c962e3ff ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.333893 containerd[1662]: 2025-12-16 12:13:05.311 [INFO][4429] 
cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.333893 containerd[1662]: 2025-12-16 12:13:05.313 [INFO][4429] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0", GenerateName:"coredns-674b8bbfcf-", Namespace:"kube-system", SelfLink:"", UID:"78a6a3e9-dded-4049-98d3-1cf9591828bf", ResourceVersion:"813", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"674b8bbfcf", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d", Pod:"coredns-674b8bbfcf-jp97r", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.195/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali7f3c962e3ff", 
MAC:"2a:c0:45:62:89:c8", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:05.333893 containerd[1662]: 2025-12-16 12:13:05.325 [INFO][4429] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" Namespace="kube-system" Pod="coredns-674b8bbfcf-jp97r" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-coredns--674b8bbfcf--jp97r-eth0" Dec 16 12:13:05.337298 containerd[1662]: time="2025-12-16T12:13:05.337220079Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-tdcxp,Uid:7883990b-d1d6-4772-8df2-71ee5dd1032b,Namespace:kube-system,Attempt:0,} returns sandbox id \"31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0\"" Dec 16 12:13:05.348248 containerd[1662]: time="2025-12-16T12:13:05.347621628Z" level=info msg="CreateContainer within sandbox \"31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:13:05.358597 kernel: kauditd_printk_skb: 262 callbacks suppressed Dec 16 12:13:05.358690 kernel: audit: type=1325 audit(1765887185.355:667): table=filter:132 family=2 entries=36 op=nft_register_chain pid=4541 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:05.358731 kernel: audit: type=1300 audit(1765887185.355:667): arch=c00000b7 syscall=211 success=yes exit=19156 a0=3 a1=ffffd0894050 a2=0 a3=ffffa8782fa8 items=0 ppid=4292 pid=4541 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.355000 audit[4541]: NETFILTER_CFG table=filter:132 family=2 entries=36 op=nft_register_chain pid=4541 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:05.355000 audit[4541]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19156 a0=3 a1=ffffd0894050 a2=0 a3=ffffa8782fa8 items=0 ppid=4292 pid=4541 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.355000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:05.367652 kernel: audit: type=1327 audit(1765887185.355:667): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:05.369426 containerd[1662]: time="2025-12-16T12:13:05.369342369Z" level=info msg="connecting to shim 0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d" address="unix:///run/containerd/s/ddaa10f41327bd4e132b0f759629cf2eca36f13fb431274a6c9a5c2d4151fb4f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:05.369830 containerd[1662]: time="2025-12-16T12:13:05.369546209Z" level=info msg="Container d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:13:05.385359 containerd[1662]: time="2025-12-16T12:13:05.385320573Z" level=info msg="CreateContainer within sandbox \"31e33e065b905d83405e988186f0615078b9975b3150ea6fed3f5e5a281a65c0\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id 
\"d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff\"" Dec 16 12:13:05.386601 containerd[1662]: time="2025-12-16T12:13:05.386237096Z" level=info msg="StartContainer for \"d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff\"" Dec 16 12:13:05.389523 containerd[1662]: time="2025-12-16T12:13:05.389410504Z" level=info msg="connecting to shim d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff" address="unix:///run/containerd/s/ed9d31477082022e111ddb9accbc16c32f1b7955f3e78afef3fcaccd9b63d248" protocol=ttrpc version=3 Dec 16 12:13:05.396895 systemd[1]: Started cri-containerd-0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d.scope - libcontainer container 0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d. Dec 16 12:13:05.416909 systemd[1]: Started cri-containerd-d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff.scope - libcontainer container d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff. Dec 16 12:13:05.419000 audit: BPF prog-id=216 op=LOAD Dec 16 12:13:05.420000 audit: BPF prog-id=217 op=LOAD Dec 16 12:13:05.420000 audit[4562]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.426056 kernel: audit: type=1334 audit(1765887185.419:668): prog-id=216 op=LOAD Dec 16 12:13:05.426235 kernel: audit: type=1334 audit(1765887185.420:669): prog-id=217 op=LOAD Dec 16 12:13:05.426267 kernel: audit: type=1300 audit(1765887185.420:669): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.426282 
kernel: audit: type=1327 audit(1765887185.420:669): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.420000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.420000 audit: BPF prog-id=217 op=UNLOAD Dec 16 12:13:05.430828 kernel: audit: type=1334 audit(1765887185.420:670): prog-id=217 op=UNLOAD Dec 16 12:13:05.430872 kernel: audit: type=1300 audit(1765887185.420:670): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.420000 audit[4562]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.420000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.438466 kernel: audit: type=1327 audit(1765887185.420:670): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.422000 audit: BPF prog-id=218 op=LOAD Dec 16 12:13:05.422000 audit[4562]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.422000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.425000 audit: BPF prog-id=219 op=LOAD Dec 16 12:13:05.425000 audit[4562]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.425000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.425000 audit: BPF prog-id=219 op=UNLOAD Dec 16 12:13:05.425000 audit[4562]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:13:05.425000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.425000 audit: BPF prog-id=218 op=UNLOAD Dec 16 12:13:05.425000 audit[4562]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.425000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.425000 audit: BPF prog-id=220 op=LOAD Dec 16 12:13:05.425000 audit[4562]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4550 pid=4562 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.425000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3030323861303233316166323766336530303931323032663538353432 Dec 16 12:13:05.439000 audit: BPF prog-id=221 op=LOAD Dec 16 12:13:05.440000 audit: BPF prog-id=222 op=LOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4494 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.440000 audit: BPF prog-id=222 op=UNLOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4494 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.440000 audit: BPF prog-id=223 op=LOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4494 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.440000 audit: BPF prog-id=224 op=LOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4494 pid=4574 auid=4294967295 
uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.440000 audit: BPF prog-id=224 op=UNLOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4494 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.440000 audit: BPF prog-id=223 op=UNLOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4494 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.440000 audit: BPF prog-id=225 op=LOAD Dec 16 12:13:05.440000 audit[4574]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 
ppid=4494 pid=4574 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.440000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6437353337666334396130323434323738643261623162343464613739 Dec 16 12:13:05.462734 containerd[1662]: time="2025-12-16T12:13:05.461711666Z" level=info msg="StartContainer for \"d7537fc49a0244278d2ab1b44da795e76c0f809e156c9b258ac176e9cfad37ff\" returns successfully" Dec 16 12:13:05.469511 containerd[1662]: time="2025-12-16T12:13:05.469474367Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-674b8bbfcf-jp97r,Uid:78a6a3e9-dded-4049-98d3-1cf9591828bf,Namespace:kube-system,Attempt:0,} returns sandbox id \"0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d\"" Dec 16 12:13:05.476442 containerd[1662]: time="2025-12-16T12:13:05.475898145Z" level=info msg="CreateContainer within sandbox \"0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:13:05.488057 containerd[1662]: time="2025-12-16T12:13:05.488003659Z" level=info msg="Container e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:13:05.497350 containerd[1662]: time="2025-12-16T12:13:05.497194325Z" level=info msg="CreateContainer within sandbox \"0028a0231af27f3e0091202f5854256142e91a79bdc39e08d8e51e4bfd74623d\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c\"" Dec 16 12:13:05.498400 containerd[1662]: time="2025-12-16T12:13:05.498368728Z" level=info msg="StartContainer for 
\"e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c\"" Dec 16 12:13:05.499252 containerd[1662]: time="2025-12-16T12:13:05.499227810Z" level=info msg="connecting to shim e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c" address="unix:///run/containerd/s/ddaa10f41327bd4e132b0f759629cf2eca36f13fb431274a6c9a5c2d4151fb4f" protocol=ttrpc version=3 Dec 16 12:13:05.519195 systemd[1]: Started cri-containerd-e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c.scope - libcontainer container e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c. Dec 16 12:13:05.532000 audit: BPF prog-id=226 op=LOAD Dec 16 12:13:05.532000 audit: BPF prog-id=227 op=LOAD Dec 16 12:13:05.532000 audit[4619]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.532000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:13:05.532000 audit[4619]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.532000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.532000 audit: BPF prog-id=228 op=LOAD Dec 16 12:13:05.532000 audit[4619]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.532000 audit: BPF prog-id=229 op=LOAD Dec 16 12:13:05.532000 audit[4619]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.532000 audit: BPF prog-id=229 op=UNLOAD Dec 16 12:13:05.532000 audit[4619]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:13:05.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.532000 audit: BPF prog-id=228 op=UNLOAD Dec 16 12:13:05.532000 audit[4619]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.532000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.533000 audit: BPF prog-id=230 op=LOAD Dec 16 12:13:05.533000 audit[4619]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4550 pid=4619 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:05.533000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531386633623230313065653235353862626239653930353763633238 Dec 16 12:13:05.551019 containerd[1662]: time="2025-12-16T12:13:05.550972514Z" level=info msg="StartContainer for \"e18f3b2010ee2558bbb9e9057cc289216d400e4d50fa58fab6c1eb65e017718c\" returns successfully" Dec 16 12:13:06.065039 containerd[1662]: time="2025-12-16T12:13:06.064986626Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-kube-controllers-5d8494b5b4-8w8m7,Uid:9aa83703-a98a-4b61-a3ff-9d2f2ed50abd,Namespace:calico-system,Attempt:0,}" Dec 16 12:13:06.065624 containerd[1662]: time="2025-12-16T12:13:06.065598908Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-ggqjc,Uid:e4dab5b6-c490-4927-98c0-45033692e43c,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:13:06.182188 systemd-networkd[1584]: cali168afa3f260: Link UP Dec 16 12:13:06.184275 systemd-networkd[1584]: cali168afa3f260: Gained carrier Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.112 [INFO][4654] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0 calico-kube-controllers-5d8494b5b4- calico-system 9aa83703-a98a-4b61-a3ff-9d2f2ed50abd 819 0 2025-12-16 12:12:44 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:5d8494b5b4 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 calico-kube-controllers-5d8494b5b4-8w8m7 eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali168afa3f260 [] [] }} ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.112 [INFO][4654] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" 
WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.141 [INFO][4682] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" HandleID="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.141 [INFO][4682] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" HandleID="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000117b20), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"calico-kube-controllers-5d8494b5b4-8w8m7", "timestamp":"2025-12-16 12:13:06.14177744 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.142 [INFO][4682] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.142 [INFO][4682] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.142 [INFO][4682] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.152 [INFO][4682] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.156 [INFO][4682] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.161 [INFO][4682] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.163 [INFO][4682] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.165 [INFO][4682] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.165 [INFO][4682] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.167 [INFO][4682] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.171 [INFO][4682] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.176 [INFO][4682] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.63.196/26] block=192.168.63.192/26 handle="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.176 [INFO][4682] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.196/26] handle="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.176 [INFO][4682] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:13:06.198941 containerd[1662]: 2025-12-16 12:13:06.176 [INFO][4682] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.196/26] IPv6=[] ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" HandleID="k8s-pod-network.331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.199430 containerd[1662]: 2025-12-16 12:13:06.178 [INFO][4654] cni-plugin/k8s.go 418: Populated endpoint ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0", GenerateName:"calico-kube-controllers-5d8494b5b4-", Namespace:"calico-system", SelfLink:"", UID:"9aa83703-a98a-4b61-a3ff-9d2f2ed50abd", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), 
Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d8494b5b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"calico-kube-controllers-5d8494b5b4-8w8m7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.63.196/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali168afa3f260", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:06.199430 containerd[1662]: 2025-12-16 12:13:06.178 [INFO][4654] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.196/32] ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.199430 containerd[1662]: 2025-12-16 12:13:06.178 [INFO][4654] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali168afa3f260 ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.199430 containerd[1662]: 2025-12-16 12:13:06.184 [INFO][4654] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.199430 containerd[1662]: 2025-12-16 12:13:06.185 [INFO][4654] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0", GenerateName:"calico-kube-controllers-5d8494b5b4-", Namespace:"calico-system", SelfLink:"", UID:"9aa83703-a98a-4b61-a3ff-9d2f2ed50abd", ResourceVersion:"819", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"5d8494b5b4", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a", Pod:"calico-kube-controllers-5d8494b5b4-8w8m7", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.63.196/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali168afa3f260", MAC:"56:14:b7:77:98:b0", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:06.199430 containerd[1662]: 2025-12-16 12:13:06.197 [INFO][4654] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" Namespace="calico-system" Pod="calico-kube-controllers-5d8494b5b4-8w8m7" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--kube--controllers--5d8494b5b4--8w8m7-eth0" Dec 16 12:13:06.213000 audit[4708]: NETFILTER_CFG table=filter:133 family=2 entries=44 op=nft_register_chain pid=4708 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:06.213000 audit[4708]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21952 a0=3 a1=ffffe861bb30 a2=0 a3=ffffb5700fa8 items=0 ppid=4292 pid=4708 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.213000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:06.224981 containerd[1662]: time="2025-12-16T12:13:06.224889991Z" level=info msg="connecting to shim 331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a" address="unix:///run/containerd/s/ba71716ec65d2480c6225a02c5f8745577f54bd3eeb0c7d3914dacdcce7b9529" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:06.246177 kubelet[2916]: I1216 12:13:06.246094 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-jp97r" podStartSLOduration=42.24590889 
podStartE2EDuration="42.24590889s" podCreationTimestamp="2025-12-16 12:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:13:06.239737513 +0000 UTC m=+49.252049476" watchObservedRunningTime="2025-12-16 12:13:06.24590889 +0000 UTC m=+49.258220813" Dec 16 12:13:06.247168 kubelet[2916]: I1216 12:13:06.247037 2916 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-674b8bbfcf-tdcxp" podStartSLOduration=42.247017333 podStartE2EDuration="42.247017333s" podCreationTimestamp="2025-12-16 12:12:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:13:06.227158878 +0000 UTC m=+49.239470801" watchObservedRunningTime="2025-12-16 12:13:06.247017333 +0000 UTC m=+49.259329296" Dec 16 12:13:06.265000 audit[4731]: NETFILTER_CFG table=filter:134 family=2 entries=20 op=nft_register_rule pid=4731 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:06.265000 audit[4731]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc5d68410 a2=0 a3=1 items=0 ppid=3060 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.265000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:06.270000 audit[4731]: NETFILTER_CFG table=nat:135 family=2 entries=14 op=nft_register_rule pid=4731 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:06.270000 audit[4731]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc5d68410 a2=0 a3=1 items=0 ppid=3060 pid=4731 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.270000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:06.276818 systemd[1]: Started cri-containerd-331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a.scope - libcontainer container 331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a. Dec 16 12:13:06.290000 audit[4748]: NETFILTER_CFG table=filter:136 family=2 entries=17 op=nft_register_rule pid=4748 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:06.290000 audit[4748]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd4609080 a2=0 a3=1 items=0 ppid=3060 pid=4748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.290000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:06.294000 audit: BPF prog-id=231 op=LOAD Dec 16 12:13:06.294000 audit: BPF prog-id=232 op=LOAD Dec 16 12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 12:13:06.294000 audit: BPF prog-id=232 op=UNLOAD Dec 16 
12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 12:13:06.294000 audit: BPF prog-id=233 op=LOAD Dec 16 12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 12:13:06.294000 audit: BPF prog-id=234 op=LOAD Dec 16 12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 
12:13:06.294000 audit: BPF prog-id=234 op=UNLOAD Dec 16 12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 12:13:06.294000 audit: BPF prog-id=233 op=UNLOAD Dec 16 12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 12:13:06.294000 audit: BPF prog-id=235 op=LOAD Dec 16 12:13:06.294000 audit[4728]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4716 pid=4728 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.294000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3333316462663832353535646435336639613532343331643438313533 Dec 16 12:13:06.303090 systemd-networkd[1584]: calif218192257a: Link UP Dec 16 12:13:06.303597 systemd-networkd[1584]: calif218192257a: Gained carrier Dec 16 12:13:06.303000 audit[4748]: NETFILTER_CFG table=nat:137 family=2 entries=47 op=nft_register_chain pid=4748 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:06.303000 audit[4748]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffd4609080 a2=0 a3=1 items=0 ppid=3060 pid=4748 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.303000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.118 [INFO][4657] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0 calico-apiserver-585b8759c9- calico-apiserver e4dab5b6-c490-4927-98c0-45033692e43c 823 0 2025-12-16 12:12:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:585b8759c9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 calico-apiserver-585b8759c9-ggqjc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calif218192257a [] [] }} ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" 
Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.118 [INFO][4657] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.141 [INFO][4684] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" HandleID="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.141 [INFO][4684] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" HandleID="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a3540), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"calico-apiserver-585b8759c9-ggqjc", "timestamp":"2025-12-16 12:13:06.14177644 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.142 [INFO][4684] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. 
Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.176 [INFO][4684] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.176 [INFO][4684] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.254 [INFO][4684] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.264 [INFO][4684] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.273 [INFO][4684] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.275 [INFO][4684] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.280 [INFO][4684] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.280 [INFO][4684] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.282 [INFO][4684] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.289 [INFO][4684] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" 
host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.297 [INFO][4684] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.197/26] block=192.168.63.192/26 handle="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.298 [INFO][4684] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.197/26] handle="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.298 [INFO][4684] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:13:06.323755 containerd[1662]: 2025-12-16 12:13:06.298 [INFO][4684] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.197/26] IPv6=[] ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" HandleID="k8s-pod-network.13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.324229 containerd[1662]: 2025-12-16 12:13:06.300 [INFO][4657] cni-plugin/k8s.go 418: Populated endpoint ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0", GenerateName:"calico-apiserver-585b8759c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"e4dab5b6-c490-4927-98c0-45033692e43c", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 34, 0, time.Local), 
DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"585b8759c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"calico-apiserver-585b8759c9-ggqjc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif218192257a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:06.324229 containerd[1662]: 2025-12-16 12:13:06.300 [INFO][4657] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.197/32] ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.324229 containerd[1662]: 2025-12-16 12:13:06.300 [INFO][4657] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calif218192257a ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.324229 containerd[1662]: 2025-12-16 12:13:06.304 [INFO][4657] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.324229 containerd[1662]: 2025-12-16 12:13:06.304 [INFO][4657] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0", GenerateName:"calico-apiserver-585b8759c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"e4dab5b6-c490-4927-98c0-45033692e43c", ResourceVersion:"823", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"585b8759c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb", Pod:"calico-apiserver-585b8759c9-ggqjc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.197/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", 
IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calif218192257a", MAC:"62:39:35:e1:8c:aa", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:06.324229 containerd[1662]: 2025-12-16 12:13:06.318 [INFO][4657] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-ggqjc" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--ggqjc-eth0" Dec 16 12:13:06.339931 containerd[1662]: time="2025-12-16T12:13:06.339691751Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-5d8494b5b4-8w8m7,Uid:9aa83703-a98a-4b61-a3ff-9d2f2ed50abd,Namespace:calico-system,Attempt:0,} returns sandbox id \"331dbf82555dd53f9a52431d48153f318314cae367f2a9de274dd168ee7c056a\"" Dec 16 12:13:06.341728 containerd[1662]: time="2025-12-16T12:13:06.341693236Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:13:06.347000 audit[4766]: NETFILTER_CFG table=filter:138 family=2 entries=62 op=nft_register_chain pid=4766 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:06.347000 audit[4766]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=31772 a0=3 a1=fffff14c3e70 a2=0 a3=ffffb3fc5fa8 items=0 ppid=4292 pid=4766 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.347000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:06.364498 containerd[1662]: time="2025-12-16T12:13:06.364458540Z" level=info msg="connecting 
to shim 13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb" address="unix:///run/containerd/s/8ab5cb15c4f4da112b29b72bda30fd50cc8aa8bb6076d696e759e13c4703c7fe" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:06.393018 systemd[1]: Started cri-containerd-13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb.scope - libcontainer container 13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb. Dec 16 12:13:06.407000 audit: BPF prog-id=236 op=LOAD Dec 16 12:13:06.409000 audit: BPF prog-id=237 op=LOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.409000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.409000 audit: BPF prog-id=238 op=LOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 
a1=40001783e8 a2=98 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.409000 audit: BPF prog-id=239 op=LOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.409000 audit: BPF prog-id=239 op=UNLOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.409000 audit: BPF prog-id=238 op=UNLOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.409000 audit: BPF prog-id=240 op=LOAD Dec 16 12:13:06.409000 audit[4786]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4775 pid=4786 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:06.409000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3133663036376633323739306338343330343332633766343433663262 Dec 16 12:13:06.431765 containerd[1662]: time="2025-12-16T12:13:06.431723167Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-ggqjc,Uid:e4dab5b6-c490-4927-98c0-45033692e43c,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"13f067f32790c8430432c7f443f2b338b8c0eff44943c9461f0c16b793bd0cbb\"" Dec 16 12:13:06.680831 systemd-networkd[1584]: cali4bb7a4c22e7: Gained IPv6LL Dec 16 12:13:06.694888 containerd[1662]: time="2025-12-16T12:13:06.694840660Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:06.696016 containerd[1662]: time="2025-12-16T12:13:06.695970423Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" 
error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:13:06.696079 containerd[1662]: time="2025-12-16T12:13:06.696048543Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:06.696333 kubelet[2916]: E1216 12:13:06.696271 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:13:06.696333 kubelet[2916]: E1216 12:13:06.696325 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:13:06.696659 containerd[1662]: time="2025-12-16T12:13:06.696609385Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:13:06.697829 kubelet[2916]: E1216 12:13:06.697747 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgkkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:06.699063 kubelet[2916]: E1216 12:13:06.698972 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:13:06.744880 systemd-networkd[1584]: cali7f3c962e3ff: Gained IPv6LL Dec 16 12:13:07.042066 containerd[1662]: time="2025-12-16T12:13:07.041865186Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:07.044066 containerd[1662]: time="2025-12-16T12:13:07.044029352Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:13:07.044118 containerd[1662]: time="2025-12-16T12:13:07.044068393Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:07.044286 kubelet[2916]: E1216 12:13:07.044240 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:07.044330 kubelet[2916]: E1216 12:13:07.044287 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:07.044462 kubelet[2916]: E1216 12:13:07.044420 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsckl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:07.045729 kubelet[2916]: E1216 12:13:07.045691 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:13:07.222673 kubelet[2916]: E1216 12:13:07.221173 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:13:07.222673 kubelet[2916]: E1216 12:13:07.222606 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:13:07.242000 audit[4814]: NETFILTER_CFG table=filter:139 family=2 entries=14 op=nft_register_rule pid=4814 
subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:07.242000 audit[4814]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffda2c6b00 a2=0 a3=1 items=0 ppid=3060 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:07.242000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:07.249000 audit[4814]: NETFILTER_CFG table=nat:140 family=2 entries=20 op=nft_register_rule pid=4814 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:07.249000 audit[4814]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffda2c6b00 a2=0 a3=1 items=0 ppid=3060 pid=4814 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:07.249000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:07.596507 kubelet[2916]: I1216 12:13:07.596460 2916 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:13:07.704791 systemd-networkd[1584]: cali168afa3f260: Gained IPv6LL Dec 16 12:13:08.065118 containerd[1662]: time="2025-12-16T12:13:08.065064796Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-7qscd,Uid:a0cf5bad-dc9b-4b29-b409-00ae25b08c7c,Namespace:calico-system,Attempt:0,}" Dec 16 12:13:08.065430 containerd[1662]: time="2025-12-16T12:13:08.065124036Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-djtxq,Uid:fd2217b9-d056-43fe-a117-755ac4fcec22,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:13:08.152903 
systemd-networkd[1584]: calif218192257a: Gained IPv6LL Dec 16 12:13:08.223453 systemd-networkd[1584]: cali2b4e2c8fb5a: Link UP Dec 16 12:13:08.224432 systemd-networkd[1584]: cali2b4e2c8fb5a: Gained carrier Dec 16 12:13:08.232469 kubelet[2916]: E1216 12:13:08.232335 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:13:08.237728 kubelet[2916]: E1216 12:13:08.236720 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.124 [INFO][4865] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0 goldmane-666569f655- calico-system a0cf5bad-dc9b-4b29-b409-00ae25b08c7c 822 0 2025-12-16 12:12:40 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:666569f655 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s 
ci-4547-0-0-4-c6e23b3406 goldmane-666569f655-7qscd eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] cali2b4e2c8fb5a [] [] }} ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.124 [INFO][4865] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.154 [INFO][4895] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" HandleID="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Workload="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.154 [INFO][4895] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" HandleID="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Workload="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40001a15f0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"goldmane-666569f655-7qscd", "timestamp":"2025-12-16 12:13:08.154181004 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 
12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.154 [INFO][4895] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.154 [INFO][4895] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.154 [INFO][4895] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.175 [INFO][4895] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.182 [INFO][4895] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.189 [INFO][4895] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.192 [INFO][4895] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.195 [INFO][4895] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.195 [INFO][4895] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.197 [INFO][4895] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310 Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.203 [INFO][4895] ipam/ipam.go 1246: Writing block in 
order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.211 [INFO][4895] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.198/26] block=192.168.63.192/26 handle="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.211 [INFO][4895] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.198/26] handle="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.211 [INFO][4895] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:13:08.251729 containerd[1662]: 2025-12-16 12:13:08.211 [INFO][4895] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.198/26] IPv6=[] ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" HandleID="k8s-pod-network.73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Workload="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.252859 containerd[1662]: 2025-12-16 12:13:08.215 [INFO][4865] cni-plugin/k8s.go 418: Populated endpoint ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"a0cf5bad-dc9b-4b29-b409-00ae25b08c7c", ResourceVersion:"822", Generation:0, 
CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"goldmane-666569f655-7qscd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.63.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2b4e2c8fb5a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:08.252859 containerd[1662]: 2025-12-16 12:13:08.216 [INFO][4865] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.198/32] ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.252859 containerd[1662]: 2025-12-16 12:13:08.216 [INFO][4865] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2b4e2c8fb5a ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.252859 containerd[1662]: 2025-12-16 12:13:08.225 [INFO][4865] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding 
ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.252859 containerd[1662]: 2025-12-16 12:13:08.225 [INFO][4865] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0", GenerateName:"goldmane-666569f655-", Namespace:"calico-system", SelfLink:"", UID:"a0cf5bad-dc9b-4b29-b409-00ae25b08c7c", ResourceVersion:"822", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"666569f655", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310", Pod:"goldmane-666569f655-7qscd", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.63.198/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"cali2b4e2c8fb5a", 
MAC:"4a:d5:6e:e6:8b:8f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:08.252859 containerd[1662]: 2025-12-16 12:13:08.246 [INFO][4865] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" Namespace="calico-system" Pod="goldmane-666569f655-7qscd" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-goldmane--666569f655--7qscd-eth0" Dec 16 12:13:08.283824 containerd[1662]: time="2025-12-16T12:13:08.283742085Z" level=info msg="connecting to shim 73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310" address="unix:///run/containerd/s/388ffa579e96aac4af21e1a9644b3a22650a0e1d94ba3c569fd85fe0e36cfd27" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:08.287000 audit[4931]: NETFILTER_CFG table=filter:141 family=2 entries=60 op=nft_register_chain pid=4931 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:08.287000 audit[4931]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=29932 a0=3 a1=fffff9d951e0 a2=0 a3=ffff7f59cfa8 items=0 ppid=4292 pid=4931 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.287000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:08.326881 systemd[1]: Started cri-containerd-73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310.scope - libcontainer container 73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310. 
Dec 16 12:13:08.332192 systemd-networkd[1584]: cali0b316a3bae6: Link UP Dec 16 12:13:08.332648 systemd-networkd[1584]: cali0b316a3bae6: Gained carrier Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.131 [INFO][4876] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0 calico-apiserver-585b8759c9- calico-apiserver fd2217b9-d056-43fe-a117-755ac4fcec22 820 0 2025-12-16 12:12:34 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:585b8759c9 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 calico-apiserver-585b8759c9-djtxq eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali0b316a3bae6 [] [] }} ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.131 [INFO][4876] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.178 [INFO][4901] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" HandleID="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 
12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.178 [INFO][4901] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" HandleID="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004c760), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"calico-apiserver-585b8759c9-djtxq", "timestamp":"2025-12-16 12:13:08.17801115 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.178 [INFO][4901] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.211 [INFO][4901] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.211 [INFO][4901] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.279 [INFO][4901] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.288 [INFO][4901] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.303 [INFO][4901] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.306 [INFO][4901] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.310 [INFO][4901] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.311 [INFO][4901] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.313 [INFO][4901] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199 Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.318 [INFO][4901] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.326 [INFO][4901] ipam/ipam.go 1262: 
Successfully claimed IPs: [192.168.63.199/26] block=192.168.63.192/26 handle="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.326 [INFO][4901] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.199/26] handle="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.326 [INFO][4901] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Dec 16 12:13:08.349769 containerd[1662]: 2025-12-16 12:13:08.326 [INFO][4901] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.199/26] IPv6=[] ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" HandleID="k8s-pod-network.af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Workload="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 12:13:08.350237 containerd[1662]: 2025-12-16 12:13:08.330 [INFO][4876] cni-plugin/k8s.go 418: Populated endpoint ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0", GenerateName:"calico-apiserver-585b8759c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"fd2217b9-d056-43fe-a117-755ac4fcec22", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", 
"app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"585b8759c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"calico-apiserver-585b8759c9-djtxq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali0b316a3bae6", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:08.350237 containerd[1662]: 2025-12-16 12:13:08.330 [INFO][4876] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.199/32] ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 12:13:08.350237 containerd[1662]: 2025-12-16 12:13:08.330 [INFO][4876] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali0b316a3bae6 ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 12:13:08.350237 containerd[1662]: 2025-12-16 12:13:08.333 [INFO][4876] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" 
Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 12:13:08.350237 containerd[1662]: 2025-12-16 12:13:08.334 [INFO][4876] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0", GenerateName:"calico-apiserver-585b8759c9-", Namespace:"calico-apiserver", SelfLink:"", UID:"fd2217b9-d056-43fe-a117-755ac4fcec22", ResourceVersion:"820", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 34, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"585b8759c9", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199", Pod:"calico-apiserver-585b8759c9-djtxq", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.199/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, 
InterfaceName:"cali0b316a3bae6", MAC:"6e:2a:d7:5a:fb:7d", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:08.350237 containerd[1662]: 2025-12-16 12:13:08.347 [INFO][4876] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" Namespace="calico-apiserver" Pod="calico-apiserver-585b8759c9-djtxq" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-calico--apiserver--585b8759c9--djtxq-eth0" Dec 16 12:13:08.355000 audit: BPF prog-id=241 op=LOAD Dec 16 12:13:08.355000 audit: BPF prog-id=242 op=LOAD Dec 16 12:13:08.355000 audit[4942]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.355000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.355000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:13:08.355000 audit[4942]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.355000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.356000 audit: 
BPF prog-id=243 op=LOAD Dec 16 12:13:08.356000 audit[4942]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.356000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.356000 audit: BPF prog-id=244 op=LOAD Dec 16 12:13:08.356000 audit[4942]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.356000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.356000 audit: BPF prog-id=244 op=UNLOAD Dec 16 12:13:08.356000 audit[4942]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.356000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.356000 audit: BPF prog-id=243 op=UNLOAD Dec 16 12:13:08.356000 audit[4942]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.356000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.356000 audit: BPF prog-id=245 op=LOAD Dec 16 12:13:08.356000 audit[4942]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=4930 pid=4942 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.356000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3733363434653138313739636265393662376434356339663637346531 Dec 16 12:13:08.366000 audit[4968]: NETFILTER_CFG table=filter:142 family=2 entries=57 op=nft_register_chain pid=4968 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:08.366000 audit[4968]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27828 a0=3 a1=ffffc48f0b00 a2=0 a3=ffff9818dfa8 items=0 ppid=4292 pid=4968 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.366000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:08.384205 containerd[1662]: time="2025-12-16T12:13:08.384142845Z" level=info msg="connecting to shim af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199" address="unix:///run/containerd/s/85734d8d114cb4bc7a7e32727bcfe2cbccfc058a595d771736a2bc75c72fb218" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:08.394972 containerd[1662]: time="2025-12-16T12:13:08.394906594Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-666569f655-7qscd,Uid:a0cf5bad-dc9b-4b29-b409-00ae25b08c7c,Namespace:calico-system,Attempt:0,} returns sandbox id \"73644e18179cbe96b7d45c9f674e1030f6a85e2e88c8616f432c2413b3687310\"" Dec 16 12:13:08.397001 containerd[1662]: time="2025-12-16T12:13:08.396972240Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:13:08.417871 systemd[1]: Started cri-containerd-af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199.scope - libcontainer container af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199. 
Dec 16 12:13:08.427000 audit: BPF prog-id=246 op=LOAD Dec 16 12:13:08.429000 audit: BPF prog-id=247 op=LOAD Dec 16 12:13:08.429000 audit[4997]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.429000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.429000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:13:08.429000 audit[4997]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.429000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.429000 audit: BPF prog-id=248 op=LOAD Dec 16 12:13:08.429000 audit[4997]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.429000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.430000 audit: BPF prog-id=249 op=LOAD Dec 16 12:13:08.430000 audit[4997]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.430000 audit: BPF prog-id=249 op=UNLOAD Dec 16 12:13:08.430000 audit[4997]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.430000 audit: BPF prog-id=248 op=UNLOAD Dec 16 12:13:08.430000 audit[4997]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:13:08.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.430000 audit: BPF prog-id=250 op=LOAD Dec 16 12:13:08.430000 audit[4997]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4979 pid=4997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:08.430000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6166393431373738666233376430373136343130333339333164343631 Dec 16 12:13:08.458286 containerd[1662]: time="2025-12-16T12:13:08.458231851Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-585b8759c9-djtxq,Uid:fd2217b9-d056-43fe-a117-755ac4fcec22,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"af941778fb37d071641033931d461878edcde1bf3915869aaabf2ed8f69fe199\"" Dec 16 12:13:08.746180 containerd[1662]: time="2025-12-16T12:13:08.746075012Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:08.747356 containerd[1662]: time="2025-12-16T12:13:08.747312176Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:13:08.747431 containerd[1662]: time="2025-12-16T12:13:08.747372136Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active 
requests=0, bytes read=0" Dec 16 12:13:08.747648 kubelet[2916]: E1216 12:13:08.747591 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:13:08.748037 kubelet[2916]: E1216 12:13:08.747672 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:13:08.748037 kubelet[2916]: E1216 12:13:08.747890 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldma
ne-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:08.748387 containerd[1662]: time="2025-12-16T12:13:08.748248419Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:13:08.749649 kubelet[2916]: E1216 12:13:08.749594 
2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:13:09.065439 containerd[1662]: time="2025-12-16T12:13:09.065286021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cfgbh,Uid:15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4,Namespace:calico-system,Attempt:0,}" Dec 16 12:13:09.084448 containerd[1662]: time="2025-12-16T12:13:09.084268794Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:09.085852 containerd[1662]: time="2025-12-16T12:13:09.085735598Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:13:09.086794 containerd[1662]: time="2025-12-16T12:13:09.085794799Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:09.087647 kubelet[2916]: E1216 12:13:09.087429 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:09.087962 kubelet[2916]: E1216 12:13:09.087889 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:09.088644 kubelet[2916]: E1216 12:13:09.088430 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnw7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:09.089667 kubelet[2916]: E1216 12:13:09.089594 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:13:09.183448 systemd-networkd[1584]: cali96fd76005c4: Link UP Dec 16 12:13:09.184179 systemd-networkd[1584]: cali96fd76005c4: Gained carrier Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.114 [INFO][5032] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint 
projectcalico.org/v3} {ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0 csi-node-driver- calico-system 15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4 725 0 2025-12-16 12:12:44 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:857b56db8f k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4547-0-0-4-c6e23b3406 csi-node-driver-cfgbh eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali96fd76005c4 [] [] }} ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.115 [INFO][5032] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.139 [INFO][5046] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" HandleID="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Workload="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.139 [INFO][5046] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" HandleID="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Workload="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400051ea80), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4547-0-0-4-c6e23b3406", "pod":"csi-node-driver-cfgbh", "timestamp":"2025-12-16 12:13:09.139142427 +0000 UTC"}, Hostname:"ci-4547-0-0-4-c6e23b3406", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.139 [INFO][5046] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.139 [INFO][5046] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.139 [INFO][5046] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4547-0-0-4-c6e23b3406' Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.149 [INFO][5046] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.154 [INFO][5046] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.160 [INFO][5046] ipam/ipam.go 511: Trying affinity for 192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.162 [INFO][5046] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.165 [INFO][5046] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.192/26 host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.165 [INFO][5046] ipam/ipam.go 1219: 
Attempting to assign 1 addresses from block block=192.168.63.192/26 handle="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.167 [INFO][5046] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599 Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.171 [INFO][5046] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.192/26 handle="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.178 [INFO][5046] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.200/26] block=192.168.63.192/26 handle="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.178 [INFO][5046] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.200/26] handle="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" host="ci-4547-0-0-4-c6e23b3406" Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.178 [INFO][5046] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:13:09.199718 containerd[1662]: 2025-12-16 12:13:09.178 [INFO][5046] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.200/26] IPv6=[] ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" HandleID="k8s-pod-network.ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Workload="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.200260 containerd[1662]: 2025-12-16 12:13:09.180 [INFO][5032] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"", Pod:"csi-node-driver-cfgbh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.63.200/32"}, 
IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali96fd76005c4", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:09.200260 containerd[1662]: 2025-12-16 12:13:09.181 [INFO][5032] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.200/32] ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.200260 containerd[1662]: 2025-12-16 12:13:09.181 [INFO][5032] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali96fd76005c4 ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.200260 containerd[1662]: 2025-12-16 12:13:09.183 [INFO][5032] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.200260 containerd[1662]: 2025-12-16 12:13:09.183 [INFO][5032] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0", 
GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4", ResourceVersion:"725", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 12, 44, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"857b56db8f", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4547-0-0-4-c6e23b3406", ContainerID:"ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599", Pod:"csi-node-driver-cfgbh", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.63.200/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali96fd76005c4", MAC:"16:9a:e3:96:d0:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:13:09.200260 containerd[1662]: 2025-12-16 12:13:09.197 [INFO][5032] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" Namespace="calico-system" Pod="csi-node-driver-cfgbh" WorkloadEndpoint="ci--4547--0--0--4--c6e23b3406-k8s-csi--node--driver--cfgbh-eth0" Dec 16 12:13:09.211000 audit[5063]: NETFILTER_CFG table=filter:143 family=2 entries=60 op=nft_register_chain pid=5063 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:13:09.211000 audit[5063]: SYSCALL arch=c00000b7 syscall=211 
success=yes exit=26704 a0=3 a1=ffffc938c520 a2=0 a3=ffffa88d2fa8 items=0 ppid=4292 pid=5063 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.211000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:13:09.222095 containerd[1662]: time="2025-12-16T12:13:09.222054138Z" level=info msg="connecting to shim ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599" address="unix:///run/containerd/s/ca0112a9a3fd5bc2b43771340ad9126c2683db6c5b74cfdbc37a1ed0d4e30cf3" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:13:09.234640 kubelet[2916]: E1216 12:13:09.234578 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:13:09.236097 kubelet[2916]: E1216 12:13:09.235996 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:13:09.260888 systemd[1]: 
Started cri-containerd-ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599.scope - libcontainer container ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599. Dec 16 12:13:09.259000 audit[5095]: NETFILTER_CFG table=filter:144 family=2 entries=14 op=nft_register_rule pid=5095 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:09.259000 audit[5095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe141cd40 a2=0 a3=1 items=0 ppid=3060 pid=5095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.259000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:09.268000 audit[5095]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5095 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:09.268000 audit[5095]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe141cd40 a2=0 a3=1 items=0 ppid=3060 pid=5095 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.268000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:09.273000 audit: BPF prog-id=251 op=LOAD Dec 16 12:13:09.274000 audit: BPF prog-id=252 op=LOAD Dec 16 12:13:09.274000 audit[5082]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.274000 
audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.274000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:13:09.274000 audit[5082]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.274000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.274000 audit: BPF prog-id=253 op=LOAD Dec 16 12:13:09.274000 audit[5082]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.274000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.275000 audit: BPF prog-id=254 op=LOAD Dec 16 12:13:09.275000 audit[5082]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.275000 audit: BPF prog-id=254 op=UNLOAD Dec 16 12:13:09.275000 audit[5082]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.275000 audit: BPF prog-id=253 op=UNLOAD Dec 16 12:13:09.275000 audit[5082]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.275000 audit: BPF prog-id=255 op=LOAD Dec 16 12:13:09.275000 audit[5082]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=5071 pid=5082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.275000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6164323737323062356530303831666239633733646430396435343035 Dec 16 12:13:09.285000 audit[5104]: NETFILTER_CFG table=filter:146 family=2 entries=14 op=nft_register_rule pid=5104 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:09.285000 audit[5104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffe63d0330 a2=0 a3=1 items=0 ppid=3060 pid=5104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.285000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:09.291000 audit[5104]: NETFILTER_CFG table=nat:147 family=2 entries=20 op=nft_register_rule pid=5104 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:13:09.291000 audit[5104]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffe63d0330 a2=0 a3=1 items=0 ppid=3060 pid=5104 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:13:09.291000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:13:09.296023 containerd[1662]: time="2025-12-16T12:13:09.295986304Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-cfgbh,Uid:15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4,Namespace:calico-system,Attempt:0,} 
returns sandbox id \"ad27720b5e0081fb9c73dd09d5405b930b5304e9549e2424ff161242c7eb4599\"" Dec 16 12:13:09.297997 containerd[1662]: time="2025-12-16T12:13:09.297564588Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:13:09.653170 containerd[1662]: time="2025-12-16T12:13:09.653105218Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:09.654338 containerd[1662]: time="2025-12-16T12:13:09.654292702Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:13:09.654544 containerd[1662]: time="2025-12-16T12:13:09.654380622Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:09.654576 kubelet[2916]: E1216 12:13:09.654520 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:13:09.654654 kubelet[2916]: E1216 12:13:09.654579 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:13:09.654793 kubelet[2916]: E1216 12:13:09.654733 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:13:09.656845 containerd[1662]: time="2025-12-16T12:13:09.656680548Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:13:09.817045 systemd-networkd[1584]: cali0b316a3bae6: Gained IPv6LL Dec 16 12:13:09.817797 systemd-networkd[1584]: cali2b4e2c8fb5a: Gained IPv6LL Dec 16 12:13:10.009216 containerd[1662]: time="2025-12-16T12:13:10.008892689Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:10.010561 containerd[1662]: time="2025-12-16T12:13:10.010522254Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:13:10.010612 containerd[1662]: time="2025-12-16T12:13:10.010564414Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:10.010801 kubelet[2916]: E1216 12:13:10.010764 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:13:10.011054 kubelet[2916]: E1216 12:13:10.010812 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:13:10.011054 kubelet[2916]: E1216 12:13:10.010928 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:10.012129 kubelet[2916]: E1216 12:13:10.012080 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:13:10.239184 kubelet[2916]: E1216 12:13:10.239136 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:13:10.239184 kubelet[2916]: E1216 12:13:10.238994 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:13:10.240548 kubelet[2916]: E1216 12:13:10.240514 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:13:10.456934 systemd-networkd[1584]: cali96fd76005c4: Gained IPv6LL Dec 16 12:13:11.244774 kubelet[2916]: E1216 12:13:11.244575 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:13:18.065138 containerd[1662]: time="2025-12-16T12:13:18.065080445Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:13:18.420157 containerd[1662]: time="2025-12-16T12:13:18.420111034Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:18.421238 containerd[1662]: time="2025-12-16T12:13:18.421208397Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:13:18.421301 containerd[1662]: time="2025-12-16T12:13:18.421235437Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:18.421431 kubelet[2916]: E1216 12:13:18.421398 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:13:18.421763 kubelet[2916]: E1216 12:13:18.421445 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:13:18.421763 kubelet[2916]: E1216 12:13:18.421543 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:62693b2085a94b8ea373cfdd88d47738,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:18.423535 containerd[1662]: time="2025-12-16T12:13:18.423509563Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:13:18.785165 containerd[1662]: 
time="2025-12-16T12:13:18.784803369Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:18.786793 containerd[1662]: time="2025-12-16T12:13:18.786674495Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:13:18.786793 containerd[1662]: time="2025-12-16T12:13:18.786730535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:18.786974 kubelet[2916]: E1216 12:13:18.786931 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:13:18.787014 kubelet[2916]: E1216 12:13:18.786987 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:13:18.787689 kubelet[2916]: E1216 12:13:18.787123 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:18.788904 kubelet[2916]: E1216 12:13:18.788836 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:13:19.066724 containerd[1662]: time="2025-12-16T12:13:19.066378434Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:13:19.430523 containerd[1662]: time="2025-12-16T12:13:19.430425447Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:19.431645 containerd[1662]: time="2025-12-16T12:13:19.431593411Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:13:19.431694 containerd[1662]: time="2025-12-16T12:13:19.431651091Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:19.431867 kubelet[2916]: E1216 12:13:19.431806 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:13:19.432140 kubelet[2916]: E1216 12:13:19.431870 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:13:19.432140 kubelet[2916]: E1216 12:13:19.431997 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgkkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHan
dler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:19.433475 kubelet[2916]: E1216 12:13:19.433428 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" 
podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:13:22.065274 containerd[1662]: time="2025-12-16T12:13:22.065227585Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:13:22.401710 containerd[1662]: time="2025-12-16T12:13:22.401647082Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:22.403671 containerd[1662]: time="2025-12-16T12:13:22.403582727Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:13:22.403671 containerd[1662]: time="2025-12-16T12:13:22.403625087Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:22.403923 kubelet[2916]: E1216 12:13:22.403859 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:22.403923 kubelet[2916]: E1216 12:13:22.403913 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:22.404446 kubelet[2916]: E1216 12:13:22.404403 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsckl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:22.405729 kubelet[2916]: E1216 12:13:22.405640 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:13:23.065373 containerd[1662]: time="2025-12-16T12:13:23.065311050Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:13:23.411251 containerd[1662]: time="2025-12-16T12:13:23.411172773Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:23.413081 containerd[1662]: time="2025-12-16T12:13:23.412900098Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:13:23.413081 containerd[1662]: time="2025-12-16T12:13:23.412940538Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:23.413264 kubelet[2916]: E1216 12:13:23.413192 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:13:23.413554 kubelet[2916]: E1216 12:13:23.413271 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull 
and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:13:23.414213 kubelet[2916]: E1216 12:13:23.413761 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) --loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:Fil
e,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:23.417247 containerd[1662]: time="2025-12-16T12:13:23.417196910Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:13:23.750731 containerd[1662]: time="2025-12-16T12:13:23.750253398Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:23.752119 containerd[1662]: time="2025-12-16T12:13:23.752059083Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:13:23.752197 containerd[1662]: time="2025-12-16T12:13:23.752128883Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:23.752352 kubelet[2916]: E1216 12:13:23.752310 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:13:23.752407 kubelet[2916]: E1216 12:13:23.752365 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:13:23.752540 kubelet[2916]: E1216 12:13:23.752480 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,Vol
umeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:23.753730 kubelet[2916]: E1216 12:13:23.753682 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:13:25.066435 containerd[1662]: time="2025-12-16T12:13:25.066396103Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:13:25.414069 containerd[1662]: time="2025-12-16T12:13:25.413937551Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:25.415540 containerd[1662]: time="2025-12-16T12:13:25.415508595Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:13:25.415691 containerd[1662]: time="2025-12-16T12:13:25.415596875Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:25.415771 kubelet[2916]: E1216 12:13:25.415734 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:25.416195 kubelet[2916]: E1216 12:13:25.415783 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:25.416400 kubelet[2916]: E1216 12:13:25.416156 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnw7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:25.416510 containerd[1662]: time="2025-12-16T12:13:25.416310037Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:13:25.417441 kubelet[2916]: E1216 12:13:25.417398 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:13:25.796279 containerd[1662]: time="2025-12-16T12:13:25.796143815Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:25.797714 containerd[1662]: time="2025-12-16T12:13:25.797592419Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:13:25.797714 containerd[1662]: time="2025-12-16T12:13:25.797649379Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:25.798354 kubelet[2916]: E1216 12:13:25.797847 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:13:25.798354 kubelet[2916]: E1216 12:13:25.797904 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:13:25.798354 kubelet[2916]: E1216 12:13:25.798036 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:25.799609 kubelet[2916]: E1216 12:13:25.799535 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:13:30.065893 kubelet[2916]: E1216 12:13:30.065832 2916 pod_workers.go:1301] "Error syncing pod, skipping" 
err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:13:34.065739 kubelet[2916]: E1216 12:13:34.065670 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:13:37.067843 kubelet[2916]: E1216 12:13:37.067787 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" 
podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:13:37.068863 kubelet[2916]: E1216 12:13:37.068415 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:13:39.068711 kubelet[2916]: E1216 12:13:39.068657 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:13:40.065549 kubelet[2916]: E1216 12:13:40.065484 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve 
image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:13:42.066394 containerd[1662]: time="2025-12-16T12:13:42.066269846Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:13:42.433836 containerd[1662]: time="2025-12-16T12:13:42.433793630Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:42.435197 containerd[1662]: time="2025-12-16T12:13:42.435124593Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:13:42.435390 containerd[1662]: time="2025-12-16T12:13:42.435157073Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:42.435600 kubelet[2916]: E1216 12:13:42.435550 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:13:42.435974 kubelet[2916]: E1216 12:13:42.435636 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:13:42.436120 kubelet[2916]: E1216 12:13:42.435938 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:62693b2085a94b8ea373cfdd88d47738,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:42.438362 containerd[1662]: time="2025-12-16T12:13:42.438337322Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:13:42.791419 containerd[1662]: 
time="2025-12-16T12:13:42.791303505Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:42.793295 containerd[1662]: time="2025-12-16T12:13:42.793255391Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:13:42.793392 containerd[1662]: time="2025-12-16T12:13:42.793342871Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:42.794118 kubelet[2916]: E1216 12:13:42.793524 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:13:42.794118 kubelet[2916]: E1216 12:13:42.793578 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:13:42.794118 kubelet[2916]: E1216 12:13:42.793707 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:42.795013 kubelet[2916]: E1216 12:13:42.794930 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:13:48.065521 containerd[1662]: time="2025-12-16T12:13:48.065479473Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:13:48.455475 containerd[1662]: time="2025-12-16T12:13:48.455362639Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:48.456796 containerd[1662]: time="2025-12-16T12:13:48.456746403Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:13:48.456905 containerd[1662]: time="2025-12-16T12:13:48.456823643Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:48.457048 kubelet[2916]: E1216 12:13:48.456987 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed 
to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:48.457048 kubelet[2916]: E1216 12:13:48.457038 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:48.457484 kubelet[2916]: E1216 12:13:48.457278 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsckl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 
},Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:48.457585 containerd[1662]: time="2025-12-16T12:13:48.457304325Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:13:48.458951 kubelet[2916]: E1216 12:13:48.458915 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:13:48.797869 containerd[1662]: time="2025-12-16T12:13:48.797754233Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:13:48.799556 containerd[1662]: time="2025-12-16T12:13:48.799502478Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:13:48.799670 containerd[1662]: time="2025-12-16T12:13:48.799577398Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:48.799893 kubelet[2916]: E1216 12:13:48.799805 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:13:48.800007 kubelet[2916]: E1216 12:13:48.799988 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:13:48.800264 kubelet[2916]: E1216 12:13:48.800213 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgkkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:48.801587 kubelet[2916]: E1216 12:13:48.801545 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:13:49.066152 containerd[1662]: time="2025-12-16T12:13:49.065518898Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:13:49.423224 containerd[1662]: time="2025-12-16T12:13:49.423147174Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 
12:13:49.424810 containerd[1662]: time="2025-12-16T12:13:49.424767539Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:13:49.424888 containerd[1662]: time="2025-12-16T12:13:49.424832499Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:49.425012 kubelet[2916]: E1216 12:13:49.424972 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:49.425061 kubelet[2916]: E1216 12:13:49.425020 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:13:49.425190 kubelet[2916]: E1216 12:13:49.425149 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnw7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:49.426349 kubelet[2916]: E1216 12:13:49.426308 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:13:53.065663 kubelet[2916]: E1216 12:13:53.065602 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:13:54.065798 containerd[1662]: time="2025-12-16T12:13:54.065680305Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:13:54.405277 containerd[1662]: time="2025-12-16T12:13:54.404893089Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:54.406526 containerd[1662]: time="2025-12-16T12:13:54.406400093Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:13:54.406526 containerd[1662]: time="2025-12-16T12:13:54.406426613Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:54.406700 kubelet[2916]: E1216 12:13:54.406662 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:13:54.407026 kubelet[2916]: E1216 12:13:54.406712 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:13:54.407085 kubelet[2916]: E1216 12:13:54.406932 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:54.407358 containerd[1662]: time="2025-12-16T12:13:54.407331256Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:13:54.408490 kubelet[2916]: E1216 12:13:54.408444 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:13:54.751030 containerd[1662]: time="2025-12-16T12:13:54.750829013Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:54.752386 containerd[1662]: time="2025-12-16T12:13:54.752339617Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:13:54.752482 containerd[1662]: time="2025-12-16T12:13:54.752375137Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:54.752615 kubelet[2916]: E1216 12:13:54.752578 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:13:54.752723 kubelet[2916]: E1216 12:13:54.752649 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:13:54.752828 kubelet[2916]: E1216 12:13:54.752773 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:13:54.754746 containerd[1662]: time="2025-12-16T12:13:54.754715823Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:13:55.076190 containerd[1662]: time="2025-12-16T12:13:55.076050838Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:13:55.077402 containerd[1662]: time="2025-12-16T12:13:55.077362122Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:13:55.077479 containerd[1662]: time="2025-12-16T12:13:55.077446122Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:13:55.077780 kubelet[2916]: E1216 12:13:55.077596 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:13:55.077780 kubelet[2916]: E1216 12:13:55.077665 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:13:55.077963 kubelet[2916]: E1216 12:13:55.077909 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:13:55.079476 kubelet[2916]: E1216 12:13:55.079435 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:13:59.065619 kubelet[2916]: E1216 12:13:59.065213 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:14:03.066034 kubelet[2916]: E1216 12:14:03.065822 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:14:04.066159 kubelet[2916]: E1216 12:14:04.066106 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:14:04.066591 kubelet[2916]: E1216 12:14:04.066254 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:14:06.065277 kubelet[2916]: E1216 12:14:06.065231 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:14:07.068469 kubelet[2916]: E1216 12:14:07.068414 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:14:11.065224 kubelet[2916]: E1216 12:14:11.065182 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:14:17.066677 kubelet[2916]: E1216 12:14:17.065759 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: 
\"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:14:18.064919 kubelet[2916]: E1216 12:14:18.064850 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:14:18.065098 kubelet[2916]: E1216 12:14:18.064928 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:14:19.066693 kubelet[2916]: E1216 12:14:19.066311 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not 
found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:14:19.066693 kubelet[2916]: E1216 12:14:19.066414 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:14:25.065406 kubelet[2916]: E1216 12:14:25.065298 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" 
pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:14:26.995000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.106:22-139.178.68.195:38270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:26.997473 systemd[1]: Started sshd@9-10.0.21.106:22-139.178.68.195:38270.service - OpenSSH per-connection server daemon (139.178.68.195:38270). Dec 16 12:14:27.001720 kernel: kauditd_printk_skb: 214 callbacks suppressed Dec 16 12:14:27.001820 kernel: audit: type=1130 audit(1765887266.995:747): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.106:22-139.178.68.195:38270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:27.906000 audit[5219]: USER_ACCT pid=5219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.908176 sshd[5219]: Accepted publickey for core from 139.178.68.195 port 38270 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:27.910926 sshd-session[5219]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:27.908000 audit[5219]: CRED_ACQ pid=5219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.915504 systemd-logind[1644]: New session 11 of user core. 
Dec 16 12:14:27.916668 kernel: audit: type=1101 audit(1765887267.906:748): pid=5219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.916757 kernel: audit: type=1103 audit(1765887267.908:749): pid=5219 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.916809 kernel: audit: type=1006 audit(1765887267.908:750): pid=5219 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Dec 16 12:14:27.908000 audit[5219]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5c55e40 a2=3 a3=0 items=0 ppid=1 pid=5219 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:27.920897 systemd[1]: Started session-11.scope - Session 11 of User core. 
Dec 16 12:14:27.923182 kernel: audit: type=1300 audit(1765887267.908:750): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc5c55e40 a2=3 a3=0 items=0 ppid=1 pid=5219 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:27.908000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:27.924824 kernel: audit: type=1327 audit(1765887267.908:750): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:27.926000 audit[5219]: USER_START pid=5219 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.933685 kernel: audit: type=1105 audit(1765887267.926:751): pid=5219 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.933770 kernel: audit: type=1103 audit(1765887267.932:752): pid=5223 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:27.932000 audit[5223]: CRED_ACQ pid=5223 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:28.064973 
kubelet[2916]: E1216 12:14:28.064901 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:14:28.512723 sshd[5223]: Connection closed by 139.178.68.195 port 38270 Dec 16 12:14:28.512469 sshd-session[5219]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:28.512000 audit[5219]: USER_END pid=5219 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:28.517269 systemd[1]: sshd@9-10.0.21.106:22-139.178.68.195:38270.service: Deactivated successfully. Dec 16 12:14:28.519038 systemd[1]: session-11.scope: Deactivated successfully. 
Dec 16 12:14:28.512000 audit[5219]: CRED_DISP pid=5219 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:28.523447 kernel: audit: type=1106 audit(1765887268.512:753): pid=5219 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:28.523541 kernel: audit: type=1104 audit(1765887268.512:754): pid=5219 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:28.515000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-10.0.21.106:22-139.178.68.195:38270 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:28.523837 systemd-logind[1644]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:14:28.524895 systemd-logind[1644]: Removed session 11. 
Dec 16 12:14:31.067418 containerd[1662]: time="2025-12-16T12:14:31.067373391Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:14:31.067845 kubelet[2916]: E1216 12:14:31.067741 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:14:31.068527 kubelet[2916]: E1216 12:14:31.067895 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:14:31.407731 containerd[1662]: time="2025-12-16T12:14:31.407603019Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:31.412710 containerd[1662]: time="2025-12-16T12:14:31.412648913Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: 
code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:14:31.412859 containerd[1662]: time="2025-12-16T12:14:31.412733073Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:31.412959 kubelet[2916]: E1216 12:14:31.412914 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:14:31.413011 kubelet[2916]: E1216 12:14:31.412966 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:14:31.413684 kubelet[2916]: E1216 12:14:31.413185 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnw7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:31.413822 containerd[1662]: time="2025-12-16T12:14:31.413316635Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:14:31.414747 kubelet[2916]: E1216 12:14:31.414706 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:14:31.756366 containerd[1662]: time="2025-12-16T12:14:31.756164150Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:31.758282 containerd[1662]: time="2025-12-16T12:14:31.758230675Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:14:31.758368 containerd[1662]: time="2025-12-16T12:14:31.758271515Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:31.758479 kubelet[2916]: E1216 12:14:31.758429 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:14:31.758656 kubelet[2916]: E1216 12:14:31.758478 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = 
NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:14:31.758656 kubelet[2916]: E1216 12:14:31.758589 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:62693b2085a94b8ea373cfdd88d47738,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:31.760920 containerd[1662]: time="2025-12-16T12:14:31.760874283Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:14:32.102316 containerd[1662]: time="2025-12-16T12:14:32.102138033Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:32.103931 containerd[1662]: time="2025-12-16T12:14:32.103825518Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:14:32.103931 containerd[1662]: time="2025-12-16T12:14:32.103864758Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:32.104522 kubelet[2916]: E1216 12:14:32.104466 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:14:32.105070 kubelet[2916]: E1216 12:14:32.104608 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:14:32.105070 kubelet[2916]: E1216 12:14:32.104844 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:32.106414 kubelet[2916]: E1216 12:14:32.106358 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:14:33.694000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.106:22-139.178.68.195:41358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:33.695653 systemd[1]: Started sshd@10-10.0.21.106:22-139.178.68.195:41358.service - OpenSSH per-connection server daemon (139.178.68.195:41358). Dec 16 12:14:33.699640 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:14:33.699746 kernel: audit: type=1130 audit(1765887273.694:756): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.106:22-139.178.68.195:41358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:34.607000 audit[5257]: USER_ACCT pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.609710 sshd[5257]: Accepted publickey for core from 139.178.68.195 port 41358 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:34.612026 sshd-session[5257]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:34.609000 audit[5257]: CRED_ACQ pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.616987 kernel: audit: type=1101 audit(1765887274.607:757): pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.617162 kernel: audit: type=1103 audit(1765887274.609:758): pid=5257 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.617187 kernel: audit: type=1006 audit(1765887274.609:759): pid=5257 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=12 res=1 Dec 16 12:14:34.619016 kernel: audit: type=1300 audit(1765887274.609:759): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeef3fd80 a2=3 a3=0 items=0 ppid=1 pid=5257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:34.609000 audit[5257]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffeef3fd80 a2=3 a3=0 items=0 ppid=1 pid=5257 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:34.620282 systemd-logind[1644]: New session 12 of user core. Dec 16 12:14:34.609000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:34.623407 kernel: audit: type=1327 audit(1765887274.609:759): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:34.629884 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 16 12:14:34.630000 audit[5257]: USER_START pid=5257 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.632000 audit[5274]: CRED_ACQ pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.639041 kernel: audit: type=1105 audit(1765887274.630:760): pid=5257 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:34.639109 kernel: audit: type=1103 audit(1765887274.632:761): 
pid=5274 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:35.188899 sshd[5274]: Connection closed by 139.178.68.195 port 41358 Dec 16 12:14:35.189435 sshd-session[5257]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:35.188000 audit[5257]: USER_END pid=5257 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:35.193433 systemd[1]: sshd@10-10.0.21.106:22-139.178.68.195:41358.service: Deactivated successfully. Dec 16 12:14:35.189000 audit[5257]: CRED_DISP pid=5257 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:35.196943 systemd[1]: session-12.scope: Deactivated successfully. 
Dec 16 12:14:35.197566 kernel: audit: type=1106 audit(1765887275.188:762): pid=5257 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:35.197623 kernel: audit: type=1104 audit(1765887275.189:763): pid=5257 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:35.193000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-10.0.21.106:22-139.178.68.195:41358 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:35.197845 systemd-logind[1644]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:14:35.198676 systemd-logind[1644]: Removed session 12. Dec 16 12:14:35.354000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.106:22-139.178.68.195:41364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:35.355953 systemd[1]: Started sshd@11-10.0.21.106:22-139.178.68.195:41364.service - OpenSSH per-connection server daemon (139.178.68.195:41364). 
Dec 16 12:14:36.224000 audit[5289]: USER_ACCT pid=5289 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:36.226674 sshd[5289]: Accepted publickey for core from 139.178.68.195 port 41364 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:36.225000 audit[5289]: CRED_ACQ pid=5289 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:36.225000 audit[5289]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcb157850 a2=3 a3=0 items=0 ppid=1 pid=5289 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:36.225000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:36.228310 sshd-session[5289]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:36.232450 systemd-logind[1644]: New session 13 of user core. Dec 16 12:14:36.237863 systemd[1]: Started session-13.scope - Session 13 of User core. 
Dec 16 12:14:36.238000 audit[5289]: USER_START pid=5289 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:36.240000 audit[5293]: CRED_ACQ pid=5293 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:36.812882 sshd[5293]: Connection closed by 139.178.68.195 port 41364 Dec 16 12:14:36.813034 sshd-session[5289]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:36.813000 audit[5289]: USER_END pid=5289 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:36.813000 audit[5289]: CRED_DISP pid=5289 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:36.818037 systemd[1]: sshd@11-10.0.21.106:22-139.178.68.195:41364.service: Deactivated successfully. Dec 16 12:14:36.816000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-10.0.21.106:22-139.178.68.195:41364 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:36.820354 systemd[1]: session-13.scope: Deactivated successfully. 
Dec 16 12:14:36.822485 systemd-logind[1644]: Session 13 logged out. Waiting for processes to exit. Dec 16 12:14:36.823918 systemd-logind[1644]: Removed session 13. Dec 16 12:14:36.998757 systemd[1]: Started sshd@12-10.0.21.106:22-139.178.68.195:41376.service - OpenSSH per-connection server daemon (139.178.68.195:41376). Dec 16 12:14:36.997000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.106:22-139.178.68.195:41376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:37.066956 containerd[1662]: time="2025-12-16T12:14:37.066526498Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:14:37.413364 containerd[1662]: time="2025-12-16T12:14:37.413316944Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:37.582083 containerd[1662]: time="2025-12-16T12:14:37.582016774Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:14:37.582200 containerd[1662]: time="2025-12-16T12:14:37.582045094Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:37.582338 kubelet[2916]: E1216 12:14:37.582301 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:14:37.582670 kubelet[2916]: E1216 12:14:37.582350 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc 
= failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:14:37.582670 kubelet[2916]: E1216 12:14:37.582467 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgkkb,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status 
-r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:37.583706 kubelet[2916]: E1216 12:14:37.583670 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:14:37.893768 sshd[5311]: Accepted publickey for core from 139.178.68.195 port 41376 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:37.892000 audit[5311]: USER_ACCT pid=5311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting 
grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:37.893000 audit[5311]: CRED_ACQ pid=5311 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:37.893000 audit[5311]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed044040 a2=3 a3=0 items=0 ppid=1 pid=5311 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:37.893000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:37.895529 sshd-session[5311]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:37.900243 systemd-logind[1644]: New session 14 of user core. Dec 16 12:14:37.910984 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 16 12:14:37.911000 audit[5311]: USER_START pid=5311 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:37.913000 audit[5340]: CRED_ACQ pid=5340 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:38.465193 sshd[5340]: Connection closed by 139.178.68.195 port 41376 Dec 16 12:14:38.465357 sshd-session[5311]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:38.466000 audit[5311]: USER_END pid=5311 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:38.466000 audit[5311]: CRED_DISP pid=5311 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:38.470498 systemd[1]: sshd@12-10.0.21.106:22-139.178.68.195:41376.service: Deactivated successfully. Dec 16 12:14:38.469000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-10.0.21.106:22-139.178.68.195:41376 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:38.474301 systemd[1]: session-14.scope: Deactivated successfully. 
Dec 16 12:14:38.477332 systemd-logind[1644]: Session 14 logged out. Waiting for processes to exit. Dec 16 12:14:38.479557 systemd-logind[1644]: Removed session 14. Dec 16 12:14:39.066196 containerd[1662]: time="2025-12-16T12:14:39.065959387Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:14:39.416960 containerd[1662]: time="2025-12-16T12:14:39.416779324Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:39.916391 containerd[1662]: time="2025-12-16T12:14:39.916312555Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:14:39.916529 containerd[1662]: time="2025-12-16T12:14:39.916360315Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:39.916736 kubelet[2916]: E1216 12:14:39.916697 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:14:39.917003 kubelet[2916]: E1216 12:14:39.916751 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:14:39.917003 kubelet[2916]: E1216 12:14:39.916879 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 
--tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsckl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:39.918344 kubelet[2916]: E1216 12:14:39.918309 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:14:43.622174 systemd[1]: Started sshd@13-10.0.21.106:22-139.178.68.195:42272.service - OpenSSH per-connection server daemon (139.178.68.195:42272). Dec 16 12:14:43.620000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.106:22-139.178.68.195:42272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:43.622957 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 12:14:43.622999 kernel: audit: type=1130 audit(1765887283.620:783): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.106:22-139.178.68.195:42272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:44.065491 kubelet[2916]: E1216 12:14:44.065360 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:14:44.433000 audit[5357]: USER_ACCT pid=5357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.435710 sshd[5357]: Accepted publickey for core from 139.178.68.195 port 42272 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:44.438050 sshd-session[5357]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:44.435000 audit[5357]: CRED_ACQ pid=5357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.442051 kernel: audit: type=1101 audit(1765887284.433:784): pid=5357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.442131 kernel: audit: type=1103 audit(1765887284.435:785): pid=5357 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 
msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.442150 kernel: audit: type=1006 audit(1765887284.435:786): pid=5357 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Dec 16 12:14:44.435000 audit[5357]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffc5aa040 a2=3 a3=0 items=0 ppid=1 pid=5357 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:44.445247 systemd-logind[1644]: New session 15 of user core. Dec 16 12:14:44.447541 kernel: audit: type=1300 audit(1765887284.435:786): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffc5aa040 a2=3 a3=0 items=0 ppid=1 pid=5357 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:44.447664 kernel: audit: type=1327 audit(1765887284.435:786): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:44.435000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:44.456951 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 12:14:44.458000 audit[5357]: USER_START pid=5357 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.460000 audit[5361]: CRED_ACQ pid=5361 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.467668 kernel: audit: type=1105 audit(1765887284.458:787): pid=5357 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.467778 kernel: audit: type=1103 audit(1765887284.460:788): pid=5361 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.977875 sshd[5361]: Connection closed by 139.178.68.195 port 42272 Dec 16 12:14:44.978420 sshd-session[5357]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:44.979000 audit[5357]: USER_END pid=5357 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.984931 systemd-logind[1644]: Session 15 logged 
out. Waiting for processes to exit. Dec 16 12:14:44.980000 audit[5357]: CRED_DISP pid=5357 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.988326 systemd[1]: sshd@13-10.0.21.106:22-139.178.68.195:42272.service: Deactivated successfully. Dec 16 12:14:44.988963 kernel: audit: type=1106 audit(1765887284.979:789): pid=5357 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.989027 kernel: audit: type=1104 audit(1765887284.980:790): pid=5357 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:44.988000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-10.0.21.106:22-139.178.68.195:42272 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:44.991515 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:14:44.994180 systemd-logind[1644]: Removed session 15. 
Dec 16 12:14:45.066009 containerd[1662]: time="2025-12-16T12:14:45.065800976Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:14:45.403173 containerd[1662]: time="2025-12-16T12:14:45.403127435Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:45.405192 containerd[1662]: time="2025-12-16T12:14:45.405051521Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:14:45.405192 containerd[1662]: time="2025-12-16T12:14:45.405141681Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:45.405362 kubelet[2916]: E1216 12:14:45.405301 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:14:45.405362 kubelet[2916]: E1216 12:14:45.405345 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:14:45.405663 kubelet[2916]: E1216 12:14:45.405547 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:45.406118 containerd[1662]: time="2025-12-16T12:14:45.406088163Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:14:45.407111 kubelet[2916]: E1216 12:14:45.407074 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:14:45.730991 containerd[1662]: time="2025-12-16T12:14:45.730766628Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:45.732270 containerd[1662]: time="2025-12-16T12:14:45.732232232Z" 
level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:14:45.732345 containerd[1662]: time="2025-12-16T12:14:45.732258152Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:45.732516 kubelet[2916]: E1216 12:14:45.732474 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:14:45.732570 kubelet[2916]: E1216 12:14:45.732527 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:14:45.732725 kubelet[2916]: E1216 12:14:45.732659 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:14:45.734700 containerd[1662]: time="2025-12-16T12:14:45.734492198Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:14:46.071560 containerd[1662]: time="2025-12-16T12:14:46.071447816Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:14:46.073173 containerd[1662]: time="2025-12-16T12:14:46.073038781Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:14:46.073260 containerd[1662]: time="2025-12-16T12:14:46.073116341Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:14:46.073603 kubelet[2916]: E1216 12:14:46.073387 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:14:46.073603 kubelet[2916]: E1216 12:14:46.073439 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:14:46.073603 kubelet[2916]: E1216 12:14:46.073555 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) 
--kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:14:46.074876 kubelet[2916]: E1216 12:14:46.074842 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:14:47.065848 kubelet[2916]: E1216 12:14:47.065802 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:14:49.065134 kubelet[2916]: E1216 12:14:49.065042 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with 
ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:14:50.145838 systemd[1]: Started sshd@14-10.0.21.106:22-139.178.68.195:42288.service - OpenSSH per-connection server daemon (139.178.68.195:42288). Dec 16 12:14:50.150224 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:14:50.150253 kernel: audit: type=1130 audit(1765887290.144:792): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.106:22-139.178.68.195:42288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:50.144000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.106:22-139.178.68.195:42288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:50.969000 audit[5375]: USER_ACCT pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:50.971223 sshd[5375]: Accepted publickey for core from 139.178.68.195 port 42288 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:50.975856 sshd-session[5375]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:50.973000 audit[5375]: CRED_ACQ pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:50.979106 kernel: audit: type=1101 audit(1765887290.969:793): pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:50.979202 kernel: audit: type=1103 audit(1765887290.973:794): pid=5375 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:50.979220 kernel: audit: type=1006 audit(1765887290.973:795): pid=5375 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Dec 16 12:14:50.973000 audit[5375]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffea812220 a2=3 a3=0 items=0 ppid=1 pid=5375 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 
comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:50.985653 kernel: audit: type=1300 audit(1765887290.973:795): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffea812220 a2=3 a3=0 items=0 ppid=1 pid=5375 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:50.985724 kernel: audit: type=1327 audit(1765887290.973:795): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:50.973000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:50.988643 systemd-logind[1644]: New session 16 of user core. Dec 16 12:14:51.000033 systemd[1]: Started session-16.scope - Session 16 of User core. Dec 16 12:14:51.000000 audit[5375]: USER_START pid=5375 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.006658 kernel: audit: type=1105 audit(1765887291.000:796): pid=5375 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.007469 kernel: audit: type=1103 audit(1765887291.005:797): pid=5379 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.005000 audit[5379]: 
CRED_ACQ pid=5379 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.514420 sshd[5379]: Connection closed by 139.178.68.195 port 42288 Dec 16 12:14:51.514267 sshd-session[5375]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:51.514000 audit[5375]: USER_END pid=5375 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.520398 systemd[1]: sshd@14-10.0.21.106:22-139.178.68.195:42288.service: Deactivated successfully. Dec 16 12:14:51.514000 audit[5375]: CRED_DISP pid=5375 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.522108 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:14:51.522817 systemd-logind[1644]: Session 16 logged out. Waiting for processes to exit. 
Dec 16 12:14:51.523625 kernel: audit: type=1106 audit(1765887291.514:798): pid=5375 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.523681 kernel: audit: type=1104 audit(1765887291.514:799): pid=5375 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:51.518000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-10.0.21.106:22-139.178.68.195:42288 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:51.524556 systemd-logind[1644]: Removed session 16. 
Dec 16 12:14:53.064923 kubelet[2916]: E1216 12:14:53.064860 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:14:56.065385 kubelet[2916]: E1216 12:14:56.065332 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:14:56.699000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.106:22-139.178.68.195:53718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:56.701259 systemd[1]: Started sshd@15-10.0.21.106:22-139.178.68.195:53718.service - OpenSSH per-connection server daemon (139.178.68.195:53718). Dec 16 12:14:56.702410 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:14:56.702460 kernel: audit: type=1130 audit(1765887296.699:801): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.106:22-139.178.68.195:53718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:14:57.574000 audit[5394]: USER_ACCT pid=5394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.576458 sshd[5394]: Accepted publickey for core from 139.178.68.195 port 53718 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:14:57.577000 audit[5394]: CRED_ACQ pid=5394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.580466 sshd-session[5394]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:14:57.582743 kernel: audit: type=1101 audit(1765887297.574:802): pid=5394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.582826 kernel: audit: type=1103 audit(1765887297.577:803): pid=5394 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.582999 kernel: audit: type=1006 audit(1765887297.578:804): pid=5394 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=17 res=1 Dec 16 12:14:57.584539 kernel: audit: type=1300 audit(1765887297.578:804): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc51c9c20 a2=3 a3=0 items=0 ppid=1 pid=5394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:57.578000 audit[5394]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc51c9c20 a2=3 a3=0 items=0 ppid=1 pid=5394 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:14:57.587686 systemd-logind[1644]: New session 17 of user core. Dec 16 12:14:57.588026 kernel: audit: type=1327 audit(1765887297.578:804): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:57.578000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:14:57.602117 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 12:14:57.603000 audit[5394]: USER_START pid=5394 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.605000 audit[5398]: CRED_ACQ pid=5398 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.612366 kernel: audit: type=1105 audit(1765887297.603:805): pid=5394 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:57.612453 kernel: audit: type=1103 audit(1765887297.605:806): 
pid=5398 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:58.066647 kubelet[2916]: E1216 12:14:58.066576 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:14:58.146256 sshd[5398]: Connection closed by 139.178.68.195 port 53718 Dec 16 12:14:58.146825 sshd-session[5394]: pam_unix(sshd:session): session closed for user core Dec 16 12:14:58.146000 audit[5394]: USER_END pid=5394 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:58.151203 systemd[1]: sshd@15-10.0.21.106:22-139.178.68.195:53718.service: Deactivated successfully. 
Dec 16 12:14:58.146000 audit[5394]: CRED_DISP pid=5394 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:58.154457 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:14:58.155375 kernel: audit: type=1106 audit(1765887298.146:807): pid=5394 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:58.155429 kernel: audit: type=1104 audit(1765887298.146:808): pid=5394 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:14:58.150000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-10.0.21.106:22-139.178.68.195:53718 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:58.157130 systemd-logind[1644]: Session 17 logged out. Waiting for processes to exit. Dec 16 12:14:58.159009 systemd-logind[1644]: Removed session 17. Dec 16 12:14:58.334000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.106:22-139.178.68.195:53724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:14:58.335608 systemd[1]: Started sshd@16-10.0.21.106:22-139.178.68.195:53724.service - OpenSSH per-connection server daemon (139.178.68.195:53724). 
Dec 16 12:14:59.068562 kubelet[2916]: E1216 12:14:59.068396 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb"
Dec 16 12:14:59.253000 audit[5412]: USER_ACCT pid=5412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:14:59.255397 sshd[5412]: Accepted publickey for core from 139.178.68.195 port 53724 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:14:59.255000 audit[5412]: CRED_ACQ pid=5412 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:14:59.255000 audit[5412]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcf3088a0 a2=3 a3=0 items=0 ppid=1 pid=5412 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:14:59.255000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:14:59.258769 sshd-session[5412]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:14:59.267940 systemd-logind[1644]: New session 18 of user core.
Dec 16 12:14:59.276149 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 16 12:14:59.276000 audit[5412]: USER_START pid=5412 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:14:59.279000 audit[5416]: CRED_ACQ pid=5416 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:14:59.888508 sshd[5416]: Connection closed by 139.178.68.195 port 53724
Dec 16 12:14:59.888817 sshd-session[5412]: pam_unix(sshd:session): session closed for user core
Dec 16 12:14:59.888000 audit[5412]: USER_END pid=5412 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:14:59.888000 audit[5412]: CRED_DISP pid=5412 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:14:59.893033 systemd[1]: sshd@16-10.0.21.106:22-139.178.68.195:53724.service: Deactivated successfully.
Dec 16 12:14:59.891000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-10.0.21.106:22-139.178.68.195:53724 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:14:59.894970 systemd[1]: session-18.scope: Deactivated successfully.
Dec 16 12:14:59.897565 systemd-logind[1644]: Session 18 logged out. Waiting for processes to exit.
Dec 16 12:14:59.898280 systemd-logind[1644]: Removed session 18.
Dec 16 12:15:00.064759 kubelet[2916]: E1216 12:15:00.064702 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c"
Dec 16 12:15:00.077299 systemd[1]: Started sshd@17-10.0.21.106:22-139.178.68.195:53740.service - OpenSSH per-connection server daemon (139.178.68.195:53740).
Dec 16 12:15:00.076000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.106:22-139.178.68.195:53740 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:00.993000 audit[5428]: USER_ACCT pid=5428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:00.995263 sshd[5428]: Accepted publickey for core from 139.178.68.195 port 53740 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:15:00.994000 audit[5428]: CRED_ACQ pid=5428 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:00.994000 audit[5428]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffea2753e0 a2=3 a3=0 items=0 ppid=1 pid=5428 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:00.994000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:00.996866 sshd-session[5428]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:15:01.001573 systemd-logind[1644]: New session 19 of user core.
Dec 16 12:15:01.010882 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 16 12:15:01.012000 audit[5428]: USER_START pid=5428 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:01.014000 audit[5432]: CRED_ACQ pid=5432 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:01.065425 kubelet[2916]: E1216 12:15:01.065355 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd"
Dec 16 12:15:01.751000 audit[5444]: NETFILTER_CFG table=filter:148 family=2 entries=26 op=nft_register_rule pid=5444 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.754485 kernel: kauditd_printk_skb: 20 callbacks suppressed
Dec 16 12:15:01.754564 kernel: audit: type=1325 audit(1765887301.751:825): table=filter:148 family=2 entries=26 op=nft_register_rule pid=5444 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.751000 audit[5444]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffee0d1d10 a2=0 a3=1 items=0 ppid=3060 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.761562 kernel: audit: type=1300 audit(1765887301.751:825): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffee0d1d10 a2=0 a3=1 items=0 ppid=3060 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.761707 kernel: audit: type=1327 audit(1765887301.751:825): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.751000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.769000 audit[5444]: NETFILTER_CFG table=nat:149 family=2 entries=20 op=nft_register_rule pid=5444 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.769000 audit[5444]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffee0d1d10 a2=0 a3=1 items=0 ppid=3060 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.779133 kernel: audit: type=1325 audit(1765887301.769:826): table=nat:149 family=2 entries=20 op=nft_register_rule pid=5444 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.779209 kernel: audit: type=1300 audit(1765887301.769:826): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffee0d1d10 a2=0 a3=1 items=0 ppid=3060 pid=5444 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.769000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.781675 kernel: audit: type=1327 audit(1765887301.769:826): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.841000 audit[5446]: NETFILTER_CFG table=filter:150 family=2 entries=38 op=nft_register_rule pid=5446 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.841000 audit[5446]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff2c06480 a2=0 a3=1 items=0 ppid=3060 pid=5446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.850319 kernel: audit: type=1325 audit(1765887301.841:827): table=filter:150 family=2 entries=38 op=nft_register_rule pid=5446 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.850383 kernel: audit: type=1300 audit(1765887301.841:827): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff2c06480 a2=0 a3=1 items=0 ppid=3060 pid=5446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.850414 kernel: audit: type=1327 audit(1765887301.841:827): proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.841000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.849000 audit[5446]: NETFILTER_CFG table=nat:151 family=2 entries=20 op=nft_register_rule pid=5446 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.854138 kernel: audit: type=1325 audit(1765887301.849:828): table=nat:151 family=2 entries=20 op=nft_register_rule pid=5446 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:01.849000 audit[5446]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff2c06480 a2=0 a3=1 items=0 ppid=3060 pid=5446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:01.849000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:01.943265 sshd[5432]: Connection closed by 139.178.68.195 port 53740
Dec 16 12:15:01.943787 sshd-session[5428]: pam_unix(sshd:session): session closed for user core
Dec 16 12:15:01.943000 audit[5428]: USER_END pid=5428 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:01.943000 audit[5428]: CRED_DISP pid=5428 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:01.948155 systemd[1]: sshd@17-10.0.21.106:22-139.178.68.195:53740.service: Deactivated successfully.
Dec 16 12:15:01.946000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-10.0.21.106:22-139.178.68.195:53740 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:01.949960 systemd[1]: session-19.scope: Deactivated successfully.
Dec 16 12:15:01.950988 systemd-logind[1644]: Session 19 logged out. Waiting for processes to exit.
Dec 16 12:15:01.953903 systemd-logind[1644]: Removed session 19.
Dec 16 12:15:02.130064 systemd[1]: Started sshd@18-10.0.21.106:22-139.178.68.195:45344.service - OpenSSH per-connection server daemon (139.178.68.195:45344).
Dec 16 12:15:02.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.106:22-139.178.68.195:45344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:03.051000 audit[5451]: USER_ACCT pid=5451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:03.054177 sshd[5451]: Accepted publickey for core from 139.178.68.195 port 45344 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:15:03.053000 audit[5451]: CRED_ACQ pid=5451 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:03.053000 audit[5451]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe94c3150 a2=3 a3=0 items=0 ppid=1 pid=5451 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:03.053000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:03.055901 sshd-session[5451]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:15:03.060229 systemd-logind[1644]: New session 20 of user core.
Dec 16 12:15:03.070057 systemd[1]: Started session-20.scope - Session 20 of User core.
Dec 16 12:15:03.072000 audit[5451]: USER_START pid=5451 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:03.073000 audit[5455]: CRED_ACQ pid=5455 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:03.761480 sshd[5455]: Connection closed by 139.178.68.195 port 45344
Dec 16 12:15:03.762034 sshd-session[5451]: pam_unix(sshd:session): session closed for user core
Dec 16 12:15:03.761000 audit[5451]: USER_END pid=5451 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:03.761000 audit[5451]: CRED_DISP pid=5451 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:03.766663 systemd-logind[1644]: Session 20 logged out. Waiting for processes to exit.
Dec 16 12:15:03.767049 systemd[1]: sshd@18-10.0.21.106:22-139.178.68.195:45344.service: Deactivated successfully.
Dec 16 12:15:03.767000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-10.0.21.106:22-139.178.68.195:45344 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:03.770335 systemd[1]: session-20.scope: Deactivated successfully.
Dec 16 12:15:03.772239 systemd-logind[1644]: Removed session 20.
Dec 16 12:15:03.932967 systemd[1]: Started sshd@19-10.0.21.106:22-139.178.68.195:45350.service - OpenSSH per-connection server daemon (139.178.68.195:45350).
Dec 16 12:15:03.931000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.106:22-139.178.68.195:45350 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:04.065596 kubelet[2916]: E1216 12:15:04.065190 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c"
Dec 16 12:15:04.804000 audit[5466]: USER_ACCT pid=5466 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:04.806602 sshd[5466]: Accepted publickey for core from 139.178.68.195 port 45350 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:15:04.805000 audit[5466]: CRED_ACQ pid=5466 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:04.805000 audit[5466]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe04afac0 a2=3 a3=0 items=0 ppid=1 pid=5466 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:04.805000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:04.808327 sshd-session[5466]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:15:04.812287 systemd-logind[1644]: New session 21 of user core.
Dec 16 12:15:04.818844 systemd[1]: Started session-21.scope - Session 21 of User core.
Dec 16 12:15:04.819000 audit[5466]: USER_START pid=5466 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:04.821000 audit[5470]: CRED_ACQ pid=5470 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:05.369693 sshd[5470]: Connection closed by 139.178.68.195 port 45350
Dec 16 12:15:05.370032 sshd-session[5466]: pam_unix(sshd:session): session closed for user core
Dec 16 12:15:05.369000 audit[5466]: USER_END pid=5466 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:05.369000 audit[5466]: CRED_DISP pid=5466 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:05.374664 systemd-logind[1644]: Session 21 logged out. Waiting for processes to exit.
Dec 16 12:15:05.375284 systemd[1]: session-21.scope: Deactivated successfully.
Dec 16 12:15:05.377154 systemd[1]: sshd@19-10.0.21.106:22-139.178.68.195:45350.service: Deactivated successfully.
Dec 16 12:15:05.375000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-10.0.21.106:22-139.178.68.195:45350 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:05.381552 systemd-logind[1644]: Removed session 21.
Dec 16 12:15:06.634000 audit[5484]: NETFILTER_CFG table=filter:152 family=2 entries=26 op=nft_register_rule pid=5484 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:06.634000 audit[5484]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd0a5dce0 a2=0 a3=1 items=0 ppid=3060 pid=5484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:06.634000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:06.643000 audit[5484]: NETFILTER_CFG table=nat:153 family=2 entries=104 op=nft_register_chain pid=5484 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:15:06.643000 audit[5484]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffd0a5dce0 a2=0 a3=1 items=0 ppid=3060 pid=5484 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:06.643000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D5700313030303030002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:15:10.070971 kubelet[2916]: E1216 12:15:10.070911 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4"
Dec 16 12:15:10.550062 kernel: kauditd_printk_skb: 33 callbacks suppressed
Dec 16 12:15:10.550175 kernel: audit: type=1130 audit(1765887310.546:852): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.106:22-139.178.68.195:40528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:10.546000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.106:22-139.178.68.195:40528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:10.547836 systemd[1]: Started sshd@20-10.0.21.106:22-139.178.68.195:40528.service - OpenSSH per-connection server daemon (139.178.68.195:40528).
Dec 16 12:15:11.066504 kubelet[2916]: E1216 12:15:11.066453 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22"
Dec 16 12:15:11.066504 kubelet[2916]: E1216 12:15:11.066491 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c"
Dec 16 12:15:11.067354 kubelet[2916]: E1216 12:15:11.067300 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb"
Dec 16 12:15:11.389454 sshd[5510]: Accepted publickey for core from 139.178.68.195 port 40528 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:15:11.387000 audit[5510]: USER_ACCT pid=5510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.391000 audit[5510]: CRED_ACQ pid=5510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.394732 sshd-session[5510]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:15:11.396811 kernel: audit: type=1101 audit(1765887311.387:853): pid=5510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.397158 kernel: audit: type=1103 audit(1765887311.391:854): pid=5510 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.397198 kernel: audit: type=1006 audit(1765887311.391:855): pid=5510 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1
Dec 16 12:15:11.391000 audit[5510]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd9f7680 a2=3 a3=0 items=0 ppid=1 pid=5510 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:11.402496 kernel: audit: type=1300 audit(1765887311.391:855): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffd9f7680 a2=3 a3=0 items=0 ppid=1 pid=5510 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:11.391000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:11.404369 kernel: audit: type=1327 audit(1765887311.391:855): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:11.406704 systemd-logind[1644]: New session 22 of user core.
Dec 16 12:15:11.410832 systemd[1]: Started session-22.scope - Session 22 of User core.
Dec 16 12:15:11.412000 audit[5510]: USER_START pid=5510 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.416000 audit[5514]: CRED_ACQ pid=5514 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.421498 kernel: audit: type=1105 audit(1765887311.412:856): pid=5510 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.421567 kernel: audit: type=1103 audit(1765887311.416:857): pid=5514 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.937278 sshd[5514]: Connection closed by 139.178.68.195 port 40528
Dec 16 12:15:11.938065 sshd-session[5510]: pam_unix(sshd:session): session closed for user core
Dec 16 12:15:11.938000 audit[5510]: USER_END pid=5510 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.943119 systemd[1]: sshd@20-10.0.21.106:22-139.178.68.195:40528.service: Deactivated successfully.
Dec 16 12:15:11.939000 audit[5510]: CRED_DISP pid=5510 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:11.944978 systemd[1]: session-22.scope: Deactivated successfully.
Dec 16 12:15:11.946362 systemd-logind[1644]: Session 22 logged out. Waiting for processes to exit.
Dec 16 12:15:11.947404 systemd-logind[1644]: Removed session 22.
Dec 16 12:15:11.947954 kernel: audit: type=1106 audit(1765887311.938:858): pid=5510 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:11.948019 kernel: audit: type=1104 audit(1765887311.939:859): pid=5510 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:11.941000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-10.0.21.106:22-139.178.68.195:40528 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:15.065599 kubelet[2916]: E1216 12:15:15.065520 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:15:17.107594 systemd[1]: Started sshd@21-10.0.21.106:22-139.178.68.195:40540.service - OpenSSH per-connection server daemon (139.178.68.195:40540). Dec 16 12:15:17.106000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.106:22-139.178.68.195:40540 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? 
addr=? terminal=? res=success' Dec 16 12:15:17.111154 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:15:17.111244 kernel: audit: type=1130 audit(1765887317.106:861): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.106:22-139.178.68.195:40540 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:17.943000 audit[5530]: USER_ACCT pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.946108 sshd[5530]: Accepted publickey for core from 139.178.68.195 port 40540 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:15:17.949546 sshd-session[5530]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:15:17.947000 audit[5530]: CRED_ACQ pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.953240 kernel: audit: type=1101 audit(1765887317.943:862): pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.953317 kernel: audit: type=1103 audit(1765887317.947:863): pid=5530 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.956564 kernel: audit: 
type=1006 audit(1765887317.947:864): pid=5530 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=23 res=1 Dec 16 12:15:17.947000 audit[5530]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd26c0e80 a2=3 a3=0 items=0 ppid=1 pid=5530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:17.960039 kernel: audit: type=1300 audit(1765887317.947:864): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd26c0e80 a2=3 a3=0 items=0 ppid=1 pid=5530 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=23 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:17.947000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:15:17.961498 kernel: audit: type=1327 audit(1765887317.947:864): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:15:17.963980 systemd-logind[1644]: New session 23 of user core. Dec 16 12:15:17.979881 systemd[1]: Started session-23.scope - Session 23 of User core. 
Dec 16 12:15:17.981000 audit[5530]: USER_START pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.983000 audit[5534]: CRED_ACQ pid=5534 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.990560 kernel: audit: type=1105 audit(1765887317.981:865): pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:17.990719 kernel: audit: type=1103 audit(1765887317.983:866): pid=5534 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:18.064980 kubelet[2916]: E1216 12:15:18.064938 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:15:18.486777 sshd[5534]: 
Connection closed by 139.178.68.195 port 40540 Dec 16 12:15:18.487112 sshd-session[5530]: pam_unix(sshd:session): session closed for user core Dec 16 12:15:18.486000 audit[5530]: USER_END pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:18.493187 systemd[1]: sshd@21-10.0.21.106:22-139.178.68.195:40540.service: Deactivated successfully. Dec 16 12:15:18.486000 audit[5530]: CRED_DISP pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:18.496447 kernel: audit: type=1106 audit(1765887318.486:867): pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:18.496541 kernel: audit: type=1104 audit(1765887318.486:868): pid=5530 uid=0 auid=500 ses=23 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:18.493000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-10.0.21.106:22-139.178.68.195:40540 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:18.497469 systemd[1]: session-23.scope: Deactivated successfully. 
Dec 16 12:15:18.500428 systemd-logind[1644]: Session 23 logged out. Waiting for processes to exit. Dec 16 12:15:18.501856 systemd-logind[1644]: Removed session 23. Dec 16 12:15:21.067856 kubelet[2916]: E1216 12:15:21.066950 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:15:23.677000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.106:22-139.178.68.195:50114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:23.679014 systemd[1]: Started sshd@22-10.0.21.106:22-139.178.68.195:50114.service - OpenSSH per-connection server daemon (139.178.68.195:50114). Dec 16 12:15:23.680146 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:15:23.680200 kernel: audit: type=1130 audit(1765887323.677:870): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.106:22-139.178.68.195:50114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:15:24.589000 audit[5548]: USER_ACCT pid=5548 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.592005 sshd[5548]: Accepted publickey for core from 139.178.68.195 port 50114 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM Dec 16 12:15:24.594698 kernel: audit: type=1101 audit(1765887324.589:871): pid=5548 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.594000 audit[5548]: CRED_ACQ pid=5548 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.596875 sshd-session[5548]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:15:24.602405 kernel: audit: type=1103 audit(1765887324.594:872): pid=5548 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.602468 kernel: audit: type=1006 audit(1765887324.594:873): pid=5548 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=24 res=1 Dec 16 12:15:24.602493 kernel: audit: type=1300 audit(1765887324.594:873): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd500f40 a2=3 a3=0 items=0 ppid=1 pid=5548 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:24.594000 audit[5548]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcd500f40 a2=3 a3=0 items=0 ppid=1 pid=5548 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=24 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:24.602908 systemd-logind[1644]: New session 24 of user core. Dec 16 12:15:24.594000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:15:24.607057 kernel: audit: type=1327 audit(1765887324.594:873): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:15:24.611884 systemd[1]: Started session-24.scope - Session 24 of User core. Dec 16 12:15:24.612000 audit[5548]: USER_START pid=5548 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.619679 kernel: audit: type=1105 audit(1765887324.612:874): pid=5548 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.618000 audit[5552]: CRED_ACQ pid=5552 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:24.623674 kernel: audit: type=1103 audit(1765887324.618:875): 
pid=5552 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:25.067887 kubelet[2916]: E1216 12:15:25.067369 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:15:25.067887 kubelet[2916]: E1216 12:15:25.067480 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:15:25.199186 sshd[5552]: Connection closed by 139.178.68.195 port 50114 Dec 16 12:15:25.199425 sshd-session[5548]: pam_unix(sshd:session): session closed for user core Dec 16 12:15:25.200000 audit[5548]: USER_END pid=5548 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:25.204845 systemd[1]: 
sshd@22-10.0.21.106:22-139.178.68.195:50114.service: Deactivated successfully. Dec 16 12:15:25.200000 audit[5548]: CRED_DISP pid=5548 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:25.206543 systemd[1]: session-24.scope: Deactivated successfully. Dec 16 12:15:25.209563 kernel: audit: type=1106 audit(1765887325.200:876): pid=5548 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:25.209659 kernel: audit: type=1104 audit(1765887325.200:877): pid=5548 uid=0 auid=500 ses=24 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success' Dec 16 12:15:25.203000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-10.0.21.106:22-139.178.68.195:50114 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:15:25.210127 systemd-logind[1644]: Session 24 logged out. Waiting for processes to exit. Dec 16 12:15:25.210928 systemd-logind[1644]: Removed session 24. 
Dec 16 12:15:26.065001 kubelet[2916]: E1216 12:15:26.064912 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd"
Dec 16 12:15:26.065547 kubelet[2916]: E1216 12:15:26.065412 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb"
Dec 16 12:15:30.367000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.106:22-139.178.68.195:58530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:30.369367 systemd[1]: Started sshd@23-10.0.21.106:22-139.178.68.195:58530.service - OpenSSH per-connection server daemon (139.178.68.195:58530).
Dec 16 12:15:30.373260 kernel: kauditd_printk_skb: 1 callbacks suppressed
Dec 16 12:15:30.373377 kernel: audit: type=1130 audit(1765887330.367:879): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.106:22-139.178.68.195:58530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:31.068047 kubelet[2916]: E1216 12:15:31.067341 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c"
Dec 16 12:15:31.236000 audit[5568]: USER_ACCT pid=5568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.241665 kernel: audit: type=1101 audit(1765887331.236:880): pid=5568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.241878 sshd[5568]: Accepted publickey for core from 139.178.68.195 port 58530 ssh2: RSA SHA256:pBWSqQr4kol1h2kE8yzVkaN581ljcEsm2AsOS5SkkkM
Dec 16 12:15:31.240000 audit[5568]: CRED_ACQ pid=5568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.243171 sshd-session[5568]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:15:31.245724 kernel: audit: type=1103 audit(1765887331.240:881): pid=5568 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.245792 kernel: audit: type=1006 audit(1765887331.240:882): pid=5568 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=25 res=1
Dec 16 12:15:31.248137 kernel: audit: type=1300 audit(1765887331.240:882): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdcd85fa0 a2=3 a3=0 items=0 ppid=1 pid=5568 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:31.240000 audit[5568]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdcd85fa0 a2=3 a3=0 items=0 ppid=1 pid=5568 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=25 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:15:31.240000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:31.252662 kernel: audit: type=1327 audit(1765887331.240:882): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:15:31.254068 systemd-logind[1644]: New session 25 of user core.
Dec 16 12:15:31.259835 systemd[1]: Started session-25.scope - Session 25 of User core.
Dec 16 12:15:31.261000 audit[5568]: USER_START pid=5568 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.265000 audit[5572]: CRED_ACQ pid=5572 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.270324 kernel: audit: type=1105 audit(1765887331.261:883): pid=5568 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.270437 kernel: audit: type=1103 audit(1765887331.265:884): pid=5572 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.804801 sshd[5572]: Connection closed by 139.178.68.195 port 58530
Dec 16 12:15:31.805085 sshd-session[5568]: pam_unix(sshd:session): session closed for user core
Dec 16 12:15:31.804000 audit[5568]: USER_END pid=5568 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.810915 systemd[1]: sshd@23-10.0.21.106:22-139.178.68.195:58530.service: Deactivated successfully.
Dec 16 12:15:31.814220 systemd[1]: session-25.scope: Deactivated successfully.
Dec 16 12:15:31.804000 audit[5568]: CRED_DISP pid=5568 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.817502 kernel: audit: type=1106 audit(1765887331.804:885): pid=5568 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.817601 kernel: audit: type=1104 audit(1765887331.804:886): pid=5568 uid=0 auid=500 ses=25 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.68.195 addr=139.178.68.195 terminal=ssh res=success'
Dec 16 12:15:31.811000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@23-10.0.21.106:22-139.178.68.195:58530 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:15:31.819946 systemd-logind[1644]: Session 25 logged out. Waiting for processes to exit.
Dec 16 12:15:31.823442 systemd-logind[1644]: Removed session 25.
Dec 16 12:15:32.065287 kubelet[2916]: E1216 12:15:32.065163 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4"
Dec 16 12:15:37.065529 kubelet[2916]: E1216 12:15:37.065188 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22"
Dec 16 12:15:37.065995 kubelet[2916]: E1216 12:15:37.065793 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb"
Dec 16 12:15:40.065361 kubelet[2916]: E1216 12:15:40.065028 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c"
Dec 16 12:15:40.065361 kubelet[2916]: E1216 12:15:40.065087 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd"
Dec 16 12:15:44.065445 kubelet[2916]: E1216 12:15:44.065057 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c"
Dec 16 12:15:47.064890 kubelet[2916]: E1216 12:15:47.064819 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4"
Dec 16 12:15:48.068797 kubelet[2916]: E1216 12:15:48.068525 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb"
Dec 16 12:15:51.065314 kubelet[2916]: E1216 12:15:51.065233 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd"
Dec 16 12:15:52.065188 kubelet[2916]: E1216 12:15:52.065089 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c"
Dec 16 12:15:52.065613 containerd[1662]: time="2025-12-16T12:15:52.065588845Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 16 12:15:52.418026 containerd[1662]: time="2025-12-16T12:15:52.417965826Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 12:15:52.419289 containerd[1662]: time="2025-12-16T12:15:52.419238350Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 16
12:15:52.419478 containerd[1662]: time="2025-12-16T12:15:52.419335990Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:15:52.419528 kubelet[2916]: E1216 12:15:52.419459 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:15:52.419528 kubelet[2916]: E1216 12:15:52.419508 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:15:52.419909 kubelet[2916]: E1216 12:15:52.419662 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key 
--tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-jnw7g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-djtxq_calico-apiserver(fd2217b9-d056-43fe-a117-755ac4fcec22): ErrImagePull: rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:15:52.420873 kubelet[2916]: E1216 12:15:52.420829 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:15:56.748851 systemd[1]: cri-containerd-56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235.scope: Deactivated successfully. Dec 16 12:15:56.749343 systemd[1]: cri-containerd-56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235.scope: Consumed 40.106s CPU time, 112.3M memory peak. Dec 16 12:15:56.750245 containerd[1662]: time="2025-12-16T12:15:56.750214171Z" level=info msg="received container exit event container_id:\"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\" id:\"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\" pid:3255 exit_status:1 exited_at:{seconds:1765887356 nanos:749814650}" Dec 16 12:15:56.755000 audit: BPF prog-id=146 op=UNLOAD Dec 16 12:15:56.757668 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:15:56.757725 kernel: audit: type=1334 audit(1765887356.755:888): prog-id=146 op=UNLOAD Dec 16 12:15:56.757916 kernel: audit: type=1334 audit(1765887356.755:889): prog-id=150 op=UNLOAD Dec 16 12:15:56.755000 audit: BPF prog-id=150 op=UNLOAD Dec 16 12:15:56.758739 systemd[1]: cri-containerd-5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433.scope: Deactivated successfully. 
Dec 16 12:15:56.759654 containerd[1662]: time="2025-12-16T12:15:56.759595077Z" level=info msg="received container exit event container_id:\"5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433\" id:\"5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433\" pid:2769 exit_status:1 exited_at:{seconds:1765887356 nanos:759228036}" Dec 16 12:15:56.759830 systemd[1]: cri-containerd-5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433.scope: Consumed 4.801s CPU time, 61.5M memory peak. Dec 16 12:15:56.759000 audit: BPF prog-id=256 op=LOAD Dec 16 12:15:56.761875 kernel: audit: type=1334 audit(1765887356.759:890): prog-id=256 op=LOAD Dec 16 12:15:56.759000 audit: BPF prog-id=93 op=UNLOAD Dec 16 12:15:56.762930 kernel: audit: type=1334 audit(1765887356.759:891): prog-id=93 op=UNLOAD Dec 16 12:15:56.762000 audit: BPF prog-id=108 op=UNLOAD Dec 16 12:15:56.764597 kernel: audit: type=1334 audit(1765887356.762:892): prog-id=108 op=UNLOAD Dec 16 12:15:56.762000 audit: BPF prog-id=112 op=UNLOAD Dec 16 12:15:56.765798 kernel: audit: type=1334 audit(1765887356.762:893): prog-id=112 op=UNLOAD Dec 16 12:15:56.779474 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235-rootfs.mount: Deactivated successfully. Dec 16 12:15:56.786265 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433-rootfs.mount: Deactivated successfully. Dec 16 12:15:56.987759 kubelet[2916]: E1216 12:15:56.987718 2916 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.106:52824->10.0.21.104:2379: read: connection timed out" Dec 16 12:15:56.990566 systemd[1]: cri-containerd-d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86.scope: Deactivated successfully. 
Dec 16 12:15:56.990891 systemd[1]: cri-containerd-d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86.scope: Consumed 2.709s CPU time, 25.7M memory peak. Dec 16 12:15:56.990000 audit: BPF prog-id=257 op=LOAD Dec 16 12:15:56.992215 containerd[1662]: time="2025-12-16T12:15:56.992185845Z" level=info msg="received container exit event container_id:\"d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86\" id:\"d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86\" pid:2736 exit_status:1 exited_at:{seconds:1765887356 nanos:991770684}" Dec 16 12:15:56.990000 audit: BPF prog-id=83 op=UNLOAD Dec 16 12:15:56.994058 kernel: audit: type=1334 audit(1765887356.990:894): prog-id=257 op=LOAD Dec 16 12:15:56.994123 kernel: audit: type=1334 audit(1765887356.990:895): prog-id=83 op=UNLOAD Dec 16 12:15:56.995000 audit: BPF prog-id=98 op=UNLOAD Dec 16 12:15:56.995000 audit: BPF prog-id=102 op=UNLOAD Dec 16 12:15:56.997737 kernel: audit: type=1334 audit(1765887356.995:896): prog-id=98 op=UNLOAD Dec 16 12:15:56.997787 kernel: audit: type=1334 audit(1765887356.995:897): prog-id=102 op=UNLOAD Dec 16 12:15:57.013383 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86-rootfs.mount: Deactivated successfully. 
Dec 16 12:15:57.598107 kubelet[2916]: I1216 12:15:57.597993 2916 scope.go:117] "RemoveContainer" containerID="5294eb4ec96867df0758048164e3b9ca486f504dc6ff6d548203471650c69433" Dec 16 12:15:57.599680 kubelet[2916]: I1216 12:15:57.599575 2916 scope.go:117] "RemoveContainer" containerID="d8fda76280e91957933b5c537b46a2581e6bdb2c7d054be2f6ae8edc941b6d86" Dec 16 12:15:57.599940 containerd[1662]: time="2025-12-16T12:15:57.599906417Z" level=info msg="CreateContainer within sandbox \"f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Dec 16 12:15:57.601639 containerd[1662]: time="2025-12-16T12:15:57.601604382Z" level=info msg="CreateContainer within sandbox \"d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:1,}" Dec 16 12:15:57.602021 kubelet[2916]: I1216 12:15:57.602003 2916 scope.go:117] "RemoveContainer" containerID="56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235" Dec 16 12:15:57.605667 containerd[1662]: time="2025-12-16T12:15:57.604613750Z" level=info msg="CreateContainer within sandbox \"a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}" Dec 16 12:15:57.617650 containerd[1662]: time="2025-12-16T12:15:57.617589027Z" level=info msg="Container 5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:57.621856 containerd[1662]: time="2025-12-16T12:15:57.621800638Z" level=info msg="Container 273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:57.630867 containerd[1662]: time="2025-12-16T12:15:57.630708943Z" level=info msg="Container 7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:15:57.631376 
containerd[1662]: time="2025-12-16T12:15:57.631329665Z" level=info msg="CreateContainer within sandbox \"f3c949d5eb2de06768a20d3a1fbdecb5ff381edaa57bbe989a906de9099d1dd3\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830\"" Dec 16 12:15:57.632677 containerd[1662]: time="2025-12-16T12:15:57.632077987Z" level=info msg="StartContainer for \"5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830\"" Dec 16 12:15:57.633505 containerd[1662]: time="2025-12-16T12:15:57.633477791Z" level=info msg="connecting to shim 5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830" address="unix:///run/containerd/s/5d9cfee8c3fc0fabcf736b39c7b4a783ada94d56625febe33804a190c85a7405" protocol=ttrpc version=3 Dec 16 12:15:57.639303 containerd[1662]: time="2025-12-16T12:15:57.639167767Z" level=info msg="CreateContainer within sandbox \"d835d90b56b547c926ecd00c51e5c10b0f38fb0cf9d6bdb07a1a859d00bf7fbc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:1,} returns container id \"273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843\"" Dec 16 12:15:57.640887 containerd[1662]: time="2025-12-16T12:15:57.640843651Z" level=info msg="StartContainer for \"273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843\"" Dec 16 12:15:57.643885 containerd[1662]: time="2025-12-16T12:15:57.643850060Z" level=info msg="CreateContainer within sandbox \"a1b53623f700cdcb3a3c2e891eae7a11e5b6419fdcf7a97c964cb547fc1958f6\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5\"" Dec 16 12:15:57.644392 containerd[1662]: time="2025-12-16T12:15:57.644344021Z" level=info msg="StartContainer for \"7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5\"" Dec 16 12:15:57.645158 containerd[1662]: time="2025-12-16T12:15:57.645127143Z" level=info msg="connecting to shim 
7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5" address="unix:///run/containerd/s/976dd9f4f6eac049d9ad1dfc2bfa02cad45e04cc04c56b7aa554dfda47ae85ac" protocol=ttrpc version=3 Dec 16 12:15:57.645311 containerd[1662]: time="2025-12-16T12:15:57.645286304Z" level=info msg="connecting to shim 273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843" address="unix:///run/containerd/s/44effb07867acd3796bb793fad9e9b9ed68813e6abb744acc4d234fc9e1337d7" protocol=ttrpc version=3 Dec 16 12:15:57.656873 systemd[1]: Started cri-containerd-5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830.scope - libcontainer container 5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830. Dec 16 12:15:57.672917 systemd[1]: Started cri-containerd-273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843.scope - libcontainer container 273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843. Dec 16 12:15:57.674737 systemd[1]: Started cri-containerd-7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5.scope - libcontainer container 7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5. 
Dec 16 12:15:57.682000 audit: BPF prog-id=258 op=LOAD Dec 16 12:15:57.682000 audit: BPF prog-id=259 op=LOAD Dec 16 12:15:57.682000 audit[5665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.682000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.685000 audit: BPF prog-id=259 op=UNLOAD Dec 16 12:15:57.685000 audit[5665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.685000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.685000 audit: BPF prog-id=260 op=LOAD Dec 16 12:15:57.685000 audit[5665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.685000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.685000 audit: BPF prog-id=261 op=LOAD Dec 16 12:15:57.685000 audit[5665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.685000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.686000 audit: BPF prog-id=261 op=UNLOAD Dec 16 12:15:57.686000 audit[5665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.686000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.686000 audit: BPF prog-id=260 op=UNLOAD Dec 16 12:15:57.686000 audit[5665]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 
16 12:15:57.686000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.687000 audit: BPF prog-id=262 op=LOAD Dec 16 12:15:57.687000 audit[5665]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2633 pid=5665 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.687000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3566363864306137636661613536666232333962343463306563646632 Dec 16 12:15:57.688000 audit: BPF prog-id=263 op=LOAD Dec 16 12:15:57.689000 audit: BPF prog-id=264 op=LOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.689000 audit: BPF prog-id=264 op=UNLOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.689000 audit: BPF prog-id=265 op=LOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.689000 audit: BPF prog-id=266 op=LOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.689000 audit: BPF prog-id=266 op=UNLOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.689000 audit: BPF prog-id=265 op=UNLOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.689000 audit: BPF prog-id=267 op=LOAD Dec 16 12:15:57.689000 audit[5677]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2583 pid=5677 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.689000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3237336163373532393761363966653833623232653630373138313964 Dec 16 12:15:57.691000 audit: BPF prog-id=268 op=LOAD Dec 16 12:15:57.693000 audit: BPF prog-id=269 op=LOAD Dec 16 12:15:57.693000 audit[5676]: SYSCALL arch=c00000b7 syscall=280 success=yes 
exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.693000 audit: BPF prog-id=269 op=UNLOAD Dec 16 12:15:57.693000 audit[5676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.693000 audit: BPF prog-id=270 op=LOAD Dec 16 12:15:57.693000 audit[5676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.693000 audit: BPF prog-id=271 op=LOAD Dec 16 12:15:57.693000 audit[5676]: 
SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.693000 audit: BPF prog-id=271 op=UNLOAD Dec 16 12:15:57.693000 audit[5676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.693000 audit: BPF prog-id=270 op=UNLOAD Dec 16 12:15:57.693000 audit[5676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.693000 audit: BPF prog-id=272 op=LOAD 
Dec 16 12:15:57.693000 audit[5676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3036 pid=5676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:15:57.693000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3766353366373866653632333165373733363161653165626438653635 Dec 16 12:15:57.717292 containerd[1662]: time="2025-12-16T12:15:57.717182424Z" level=info msg="StartContainer for \"7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5\" returns successfully" Dec 16 12:15:57.729845 containerd[1662]: time="2025-12-16T12:15:57.729808579Z" level=info msg="StartContainer for \"273ac75297a69fe83b22e6071819d6e389e7582bd251abe2d4d6d0b1879d8843\" returns successfully" Dec 16 12:15:57.735905 containerd[1662]: time="2025-12-16T12:15:57.735868316Z" level=info msg="StartContainer for \"5f68d0a7cfaa56fb239b44c0ecdf2f3770882efee3e642a23d4c1fcb2f8b2830\" returns successfully" Dec 16 12:15:57.781781 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2166643107.mount: Deactivated successfully. 
Dec 16 12:15:58.067021 kubelet[2916]: E1216 12:15:58.066958 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4" Dec 16 12:15:59.066347 kubelet[2916]: E1216 12:15:59.066269 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c" Dec 16 12:15:59.067542 containerd[1662]: time="2025-12-16T12:15:59.067446624Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:15:59.390034 containerd[1662]: time="2025-12-16T12:15:59.389965523Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:15:59.391395 containerd[1662]: time="2025-12-16T12:15:59.391288726Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" 
error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:15:59.391395 containerd[1662]: time="2025-12-16T12:15:59.391341406Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:15:59.391548 kubelet[2916]: E1216 12:15:59.391483 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:15:59.391548 kubelet[2916]: E1216 12:15:59.391530 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:15:59.391907 kubelet[2916]: E1216 12:15:59.391656 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker,Image:ghcr.io/flatcar/calico/whisker:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:CALICO_VERSION,Value:v3.30.4,ValueFrom:nil,},EnvVar{Name:CLUSTER_ID,Value:62693b2085a94b8ea373cfdd88d47738,ValueFrom:nil,},EnvVar{Name:CLUSTER_TYPE,Value:typha,kdd,k8s,operator,bgp,kubeadm,ValueFrom:nil,},EnvVar{Name:NOTIFICATIONS,Value:Enabled,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:15:59.393588 containerd[1662]: time="2025-12-16T12:15:59.393553613Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:15:59.918553 containerd[1662]: 
time="2025-12-16T12:15:59.918500714Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:15:59.920214 containerd[1662]: time="2025-12-16T12:15:59.920146039Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:15:59.920290 containerd[1662]: time="2025-12-16T12:15:59.920241359Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:15:59.920448 kubelet[2916]: E1216 12:15:59.920389 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:15:59.920448 kubelet[2916]: E1216 12:15:59.920439 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:15:59.920599 kubelet[2916]: E1216 12:15:59.920549 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:whisker-backend,Image:ghcr.io/flatcar/calico/whisker-backend:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:3002,ValueFrom:nil,},EnvVar{Name:GOLDMANE_HOST,Value:goldmane.calico-system.svc.cluster.local:7443,ValueFrom:nil,},EnvVar{Name:TLS_CERT_PATH,Value:/whisker-backend-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:TLS_KEY_PATH,Value:/whisker-backend-key-pair/tls.key,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:whisker-backend-key-pair,ReadOnly:true,MountPath:/whisker-backend-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:whisker-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-9842c,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod whisker-5996544d9f-ldq7h_calico-system(9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:15:59.921782 kubelet[2916]: E1216 12:15:59.921698 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb" Dec 16 12:16:00.571161 kubelet[2916]: E1216 12:16:00.571041 2916 event.go:359] "Server rejected event (will not retry!)" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.106:52678->10.0.21.104:2379: read: connection timed out" event="&Event{ObjectMeta:{kube-apiserver-ci-4547-0-0-4-c6e23b3406.1881b130593d6c5a kube-system 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:kube-system,Name:kube-apiserver-ci-4547-0-0-4-c6e23b3406,UID:ab47c6a6003114c72e61300e646e4520,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{kube-apiserver},},Reason:Unhealthy,Message:Readiness probe failed: HTTP probe failed with statuscode: 500,Source:EventSource{Component:kubelet,Host:ci-4547-0-0-4-c6e23b3406,},FirstTimestamp:2025-12-16 12:15:50.119689306 +0000 UTC m=+213.132001189,LastTimestamp:2025-12-16 12:15:50.119689306 +0000 UTC m=+213.132001189,Count:1,Type:Warning,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4547-0-0-4-c6e23b3406,}" Dec 16 12:16:02.939158 kubelet[2916]: I1216 12:16:02.939096 2916 status_manager.go:895] "Failed to get status for pod" podUID="089c865778d94423bfd6c7ecfbcac378" pod="kube-system/kube-controller-manager-ci-4547-0-0-4-c6e23b3406" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.21.106:52774->10.0.21.104:2379: read: connection timed out" Dec 16 12:16:03.065125 kubelet[2916]: E1216 12:16:03.065047 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22" Dec 16 12:16:03.065566 containerd[1662]: time="2025-12-16T12:16:03.065463318Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:16:03.401196 containerd[1662]: time="2025-12-16T12:16:03.401141333Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:03.402814 containerd[1662]: time="2025-12-16T12:16:03.402764738Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:16:03.402902 containerd[1662]: time="2025-12-16T12:16:03.402853778Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:03.403085 kubelet[2916]: E1216 
12:16:03.403019 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:03.403147 kubelet[2916]: E1216 12:16:03.403077 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:16:03.403282 kubelet[2916]: E1216 12:16:03.403220 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-kube-controllers,Image:ghcr.io/flatcar/calico/kube-controllers:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:KUBE_CONTROLLERS_CONFIG_NAME,Value:default,ValueFrom:nil,},EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:ENABLED_CONTROLLERS,Value:node,loadbalancer,ValueFrom:nil,},EnvVar{Name:DISABLE_KUBE_CONTROLLERS_CONFIG_API,Value:false,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:CA_CRT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:tigera-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/cert.pem,SubPath:ca-bundle.crt,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-xgkkb,ReadOnly:true,MountPath:/va
r/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -l],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:10,TimeoutSeconds:10,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:6,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/usr/bin/check-status -r],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:10,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*999,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-kube-controllers-5d8494b5b4-8w8m7_calico-system(9aa83703-a98a-4b61-a3ff-9d2f2ed50abd): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:03.404436 kubelet[2916]: E1216 12:16:03.404386 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd" Dec 16 12:16:06.065594 containerd[1662]: time="2025-12-16T12:16:06.065543393Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:16:06.400253 containerd[1662]: time="2025-12-16T12:16:06.400142285Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:06.402840 containerd[1662]: time="2025-12-16T12:16:06.402743893Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:16:06.403011 containerd[1662]: time="2025-12-16T12:16:06.402848373Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:06.403228 kubelet[2916]: E1216 12:16:06.403098 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:06.403228 kubelet[2916]: E1216 12:16:06.403185 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:16:06.403753 kubelet[2916]: E1216 12:16:06.403313 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container 
&Container{Name:goldmane,Image:ghcr.io/flatcar/calico/goldmane:v3.30.4,Command:[],Args:[],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:INFO,ValueFrom:nil,},EnvVar{Name:PORT,Value:7443,ValueFrom:nil,},EnvVar{Name:SERVER_CERT_PATH,Value:/goldmane-key-pair/tls.crt,ValueFrom:nil,},EnvVar{Name:SERVER_KEY_PATH,Value:/goldmane-key-pair/tls.key,ValueFrom:nil,},EnvVar{Name:CA_CERT_PATH,Value:/etc/pki/tls/certs/tigera-ca-bundle.crt,ValueFrom:nil,},EnvVar{Name:PUSH_URL,Value:https://guardian.calico-system.svc.cluster.local:443/api/v1/flows/bulk,ValueFrom:nil,},EnvVar{Name:FILE_CONFIG_PATH,Value:/config/config.json,ValueFrom:nil,},EnvVar{Name:HEALTH_ENABLED,Value:true,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:config,ReadOnly:true,MountPath:/config,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-ca-bundle,ReadOnly:true,MountPath:/etc/pki/tls/certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:goldmane-key-pair,ReadOnly:true,MountPath:/goldmane-key-pair,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-5cprm,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health -live],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:&ExecAction{Command:[/health 
-ready],},HTTPGet:nil,TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:30,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod goldmane-666569f655-7qscd_calico-system(a0cf5bad-dc9b-4b29-b409-00ae25b08c7c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:16:06.404568 kubelet[2916]: E1216 12:16:06.404524 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-666569f655-7qscd" podUID="a0cf5bad-dc9b-4b29-b409-00ae25b08c7c" Dec 16 12:16:06.988882 kubelet[2916]: E1216 12:16:06.988397 2916 controller.go:195] "Failed to update lease" err="Put \"https://10.0.21.106:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4547-0-0-4-c6e23b3406?timeout=10s\": context deadline exceeded" Dec 16 12:16:08.919993 systemd[1]: 
cri-containerd-7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5.scope: Deactivated successfully. Dec 16 12:16:08.920745 containerd[1662]: time="2025-12-16T12:16:08.920355304Z" level=info msg="received container exit event container_id:\"7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5\" id:\"7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5\" pid:5714 exit_status:1 exited_at:{seconds:1765887368 nanos:920174183}" Dec 16 12:16:08.927000 audit: BPF prog-id=268 op=UNLOAD Dec 16 12:16:08.929446 kernel: kauditd_printk_skb: 66 callbacks suppressed Dec 16 12:16:08.929552 kernel: audit: type=1334 audit(1765887368.927:922): prog-id=268 op=UNLOAD Dec 16 12:16:08.929575 kernel: audit: type=1334 audit(1765887368.927:923): prog-id=272 op=UNLOAD Dec 16 12:16:08.927000 audit: BPF prog-id=272 op=UNLOAD Dec 16 12:16:08.942782 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5-rootfs.mount: Deactivated successfully. 
Dec 16 12:16:09.065262 containerd[1662]: time="2025-12-16T12:16:09.065067667Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:16:09.383530 containerd[1662]: time="2025-12-16T12:16:09.383455954Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:09.384922 containerd[1662]: time="2025-12-16T12:16:09.384871718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:16:09.385014 containerd[1662]: time="2025-12-16T12:16:09.384959118Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:16:09.385170 kubelet[2916]: E1216 12:16:09.385125 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:09.385451 kubelet[2916]: E1216 12:16:09.385179 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:16:09.385451 kubelet[2916]: E1216 12:16:09.385301 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-csi,Image:ghcr.io/flatcar/calico/csi:v3.30.4,Command:[],Args:[--nodeid=$(KUBE_NODE_NAME) 
--loglevel=$(LOG_LEVEL)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:LOG_LEVEL,Value:warn,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:kubelet-dir,ReadOnly:false,MountPath:/var/lib/kubelet,SubPath:,MountPropagation:*Bidirectional,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:varrun,ReadOnly:false,MountPath:/var/run,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:16:09.387377 containerd[1662]: time="2025-12-16T12:16:09.387352124Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:16:09.639514 kubelet[2916]: I1216 12:16:09.639334 2916 scope.go:117] "RemoveContainer" containerID="56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235" Dec 16 12:16:09.639842 kubelet[2916]: I1216 12:16:09.639799 2916 scope.go:117] "RemoveContainer" containerID="7f53f78fe6231e77361ae1ebd8e659089d7259c0c7e67b35f80dd28e70d23ae5" Dec 16 12:16:09.640008 kubelet[2916]: E1216 12:16:09.639985 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"tigera-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=tigera-operator pod=tigera-operator-7dcd859c48-m687x_tigera-operator(561879bb-4f13-4ab8-8d45-be9ae9ec7afc)\"" pod="tigera-operator/tigera-operator-7dcd859c48-m687x" podUID="561879bb-4f13-4ab8-8d45-be9ae9ec7afc" Dec 16 12:16:09.641160 containerd[1662]: time="2025-12-16T12:16:09.641130991Z" level=info msg="RemoveContainer for \"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\"" Dec 16 12:16:09.650315 containerd[1662]: time="2025-12-16T12:16:09.650261217Z" level=info msg="RemoveContainer for \"56498aec7695fddc8d0bb0d480484e8d4e90812a5bb1505d4dd297a803af1235\" returns successfully" Dec 16 12:16:09.728693 containerd[1662]: time="2025-12-16T12:16:09.728651555Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:16:09.730361 containerd[1662]: time="2025-12-16T12:16:09.730250639Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:16:09.730361 containerd[1662]: 
time="2025-12-16T12:16:09.730301480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0"
Dec 16 12:16:09.730545 kubelet[2916]: E1216 12:16:09.730476 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 16 12:16:09.730545 kubelet[2916]: E1216 12:16:09.730525 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4"
Dec 16 12:16:09.730706 kubelet[2916]: E1216 12:16:09.730670 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:csi-node-driver-registrar,Image:ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4,Command:[],Args:[--v=5 --csi-address=$(ADDRESS) --kubelet-registration-path=$(DRIVER_REG_SOCK_PATH)],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:ADDRESS,Value:/csi/csi.sock,ValueFrom:nil,},EnvVar{Name:DRIVER_REG_SOCK_PATH,Value:/var/lib/kubelet/plugins/csi.tigera.io/csi.sock,ValueFrom:nil,},EnvVar{Name:KUBE_NODE_NAME,Value:,ValueFrom:&EnvVarSource{FieldRef:&ObjectFieldSelector{APIVersion:v1,FieldPath:spec.nodeName,},ResourceFieldRef:nil,ConfigMapKeyRef:nil,SecretKeyRef:nil,},},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:registration-dir,ReadOnly:false,MountPath:/registration,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:socket-dir,ReadOnly:false,MountPath:/csi,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-z4h5p,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*true,SELinuxOptions:nil,RunAsUser:*0,RunAsNonRoot:*false,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*true,RunAsGroup:*0,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod csi-node-driver-cfgbh_calico-system(15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError"
Dec 16 12:16:09.731895 kubelet[2916]: E1216 12:16:09.731856 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-cfgbh" podUID="15e1938f-0bd9-43f1-a8b8-ecedb5c4b1c4"
Dec 16 12:16:11.066211 containerd[1662]: time="2025-12-16T12:16:11.066089720Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\""
Dec 16 12:16:11.424351 containerd[1662]: time="2025-12-16T12:16:11.424269117Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Dec 16 12:16:11.425875 containerd[1662]: time="2025-12-16T12:16:11.425800841Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found"
Dec 16 12:16:11.425977 containerd[1662]: time="2025-12-16T12:16:11.425885802Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0"
Dec 16 12:16:11.426051 kubelet[2916]: E1216 12:16:11.426012 2916 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 16 12:16:11.426355 kubelet[2916]: E1216 12:16:11.426060 2916 kuberuntime_image.go:42] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4"
Dec 16 12:16:11.426355 kubelet[2916]: E1216 12:16:11.426191 2916 kuberuntime_manager.go:1358] "Unhandled Error" err="container &Container{Name:calico-apiserver,Image:ghcr.io/flatcar/calico/apiserver:v3.30.4,Command:[],Args:[--secure-port=5443 --tls-private-key-file=/calico-apiserver-certs/tls.key --tls-cert-file=/calico-apiserver-certs/tls.crt],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{EnvVar{Name:DATASTORE_TYPE,Value:kubernetes,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_HOST,Value:10.96.0.1,ValueFrom:nil,},EnvVar{Name:KUBERNETES_SERVICE_PORT,Value:443,ValueFrom:nil,},EnvVar{Name:LOG_LEVEL,Value:info,ValueFrom:nil,},EnvVar{Name:MULTI_INTERFACE_MODE,Value:none,ValueFrom:nil,},},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:calico-apiserver-certs,ReadOnly:true,MountPath:/calico-apiserver-certs,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-hsckl,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:&Probe{ProbeHandler:ProbeHandler{Exec:nil,HTTPGet:&HTTPGetAction{Path:/readyz,Port:{0 5443 },Host:,Scheme:HTTPS,HTTPHeaders:[]HTTPHeader{},},TCPSocket:nil,GRPC:nil,},InitialDelaySeconds:0,TimeoutSeconds:5,PeriodSeconds:60,SuccessThreshold:1,FailureThreshold:3,TerminationGracePeriodSeconds:nil,},Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:IfNotPresent,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:*false,SELinuxOptions:nil,RunAsUser:*10001,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:*10001,ProcMount:nil,WindowsOptions:nil,SeccompProfile:&SeccompProfile{Type:RuntimeDefault,LocalhostProfile:nil,},AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:File,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod calico-apiserver-585b8759c9-ggqjc_calico-apiserver(e4dab5b6-c490-4927-98c0-45033692e43c): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError"
Dec 16 12:16:11.427387 kubelet[2916]: E1216 12:16:11.427353 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-ggqjc" podUID="e4dab5b6-c490-4927-98c0-45033692e43c"
Dec 16 12:16:14.064937 kubelet[2916]: E1216 12:16:14.064870 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-585b8759c9-djtxq" podUID="fd2217b9-d056-43fe-a117-755ac4fcec22"
Dec 16 12:16:15.065401 kubelet[2916]: E1216 12:16:15.065326 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5996544d9f-ldq7h" podUID="9d7a60ea-0d84-4a1a-8fbd-83649fe49cfb"
Dec 16 12:16:16.989664 kubelet[2916]: E1216 12:16:16.989600 2916 controller.go:195] "Failed to update lease" err="the server was unable to return a response in the time allotted, but may still be processing the request (put leases.coordination.k8s.io ci-4547-0-0-4-c6e23b3406)"
Dec 16 12:16:17.065944 kubelet[2916]: E1216 12:16:17.065850 2916 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-5d8494b5b4-8w8m7" podUID="9aa83703-a98a-4b61-a3ff-9d2f2ed50abd"
Dec 16 12:16:18.330677 kernel: pcieport 0000:00:01.0: pciehp: Slot(0): Button press: will power off in 5 sec