Dec 16 12:23:47.452871 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Dec 16 12:23:47.452895 kernel: Linux version 6.12.61-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Fri Dec 12 15:17:36 -00 2025 Dec 16 12:23:47.452906 kernel: KASLR enabled Dec 16 12:23:47.452912 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Dec 16 12:23:47.452917 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390bb018 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 Dec 16 12:23:47.452923 kernel: random: crng init done Dec 16 12:23:47.452930 kernel: secureboot: Secure boot disabled Dec 16 12:23:47.452936 kernel: ACPI: Early table checksum verification disabled Dec 16 12:23:47.452942 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Dec 16 12:23:47.452950 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Dec 16 12:23:47.452956 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.452962 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.452968 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.452975 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.452983 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.452990 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.452997 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.453003 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.453010 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Dec 16 12:23:47.453016 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Dec 16 12:23:47.453023 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Dec 16 12:23:47.453029 kernel: ACPI: Use ACPI SPCR as default console: Yes Dec 16 12:23:47.453035 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Dec 16 12:23:47.453043 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff] Dec 16 12:23:47.453049 kernel: Zone ranges: Dec 16 12:23:47.453056 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Dec 16 12:23:47.453062 kernel: DMA32 empty Dec 16 12:23:47.453068 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Dec 16 12:23:47.453075 kernel: Device empty Dec 16 12:23:47.453081 kernel: Movable zone start for each node Dec 16 12:23:47.453087 kernel: Early memory node ranges Dec 16 12:23:47.453094 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] Dec 16 12:23:47.453100 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] Dec 16 12:23:47.453106 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] Dec 16 12:23:47.453113 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Dec 16 12:23:47.453120 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Dec 16 12:23:47.453127 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Dec 16 12:23:47.453133 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Dec 16 12:23:47.453139 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Dec 16 12:23:47.453146 kernel: node 0: [mem 
0x0000000139fe0000-0x0000000139ffffff] Dec 16 12:23:47.453155 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Dec 16 12:23:47.453164 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Dec 16 12:23:47.453171 kernel: cma: Reserved 16 MiB at 0x00000000ff000000 on node -1 Dec 16 12:23:47.453177 kernel: psci: probing for conduit method from ACPI. Dec 16 12:23:47.453184 kernel: psci: PSCIv1.1 detected in firmware. Dec 16 12:23:47.453191 kernel: psci: Using standard PSCI v0.2 function IDs Dec 16 12:23:47.453198 kernel: psci: Trusted OS migration not required Dec 16 12:23:47.453204 kernel: psci: SMC Calling Convention v1.1 Dec 16 12:23:47.453217 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Dec 16 12:23:47.453225 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Dec 16 12:23:47.453232 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Dec 16 12:23:47.453239 kernel: pcpu-alloc: [0] 0 [0] 1 Dec 16 12:23:47.453246 kernel: Detected PIPT I-cache on CPU0 Dec 16 12:23:47.453252 kernel: CPU features: detected: GIC system register CPU interface Dec 16 12:23:47.453259 kernel: CPU features: detected: Spectre-v4 Dec 16 12:23:47.456455 kernel: CPU features: detected: Spectre-BHB Dec 16 12:23:47.456480 kernel: CPU features: kernel page table isolation forced ON by KASLR Dec 16 12:23:47.456487 kernel: CPU features: detected: Kernel page table isolation (KPTI) Dec 16 12:23:47.456494 kernel: CPU features: detected: ARM erratum 1418040 Dec 16 12:23:47.456502 kernel: CPU features: detected: SSBS not fully self-synchronizing Dec 16 12:23:47.456516 kernel: alternatives: applying boot alternatives Dec 16 12:23:47.456524 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=f511955c7ec069359d088640c1194932d6d915b5bb2829e8afbb591f10cd0849 Dec 16 12:23:47.456532 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Dec 16 12:23:47.456539 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Dec 16 12:23:47.456546 kernel: Fallback order for Node 0: 0 Dec 16 12:23:47.456553 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 Dec 16 12:23:47.456560 kernel: Policy zone: Normal Dec 16 12:23:47.456567 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Dec 16 12:23:47.456574 kernel: software IO TLB: area num 2. Dec 16 12:23:47.456581 kernel: software IO TLB: mapped [mem 0x00000000fb000000-0x00000000ff000000] (64MB) Dec 16 12:23:47.456589 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Dec 16 12:23:47.456596 kernel: rcu: Preemptible hierarchical RCU implementation. Dec 16 12:23:47.456603 kernel: rcu: RCU event tracing is enabled. Dec 16 12:23:47.456610 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Dec 16 12:23:47.456618 kernel: Trampoline variant of Tasks RCU enabled. Dec 16 12:23:47.456625 kernel: Tracing variant of Tasks RCU enabled. Dec 16 12:23:47.456631 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Dec 16 12:23:47.456638 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Dec 16 12:23:47.456645 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Dec 16 12:23:47.456652 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Dec 16 12:23:47.456660 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Dec 16 12:23:47.456668 kernel: GICv3: 256 SPIs implemented Dec 16 12:23:47.456675 kernel: GICv3: 0 Extended SPIs implemented Dec 16 12:23:47.456682 kernel: Root IRQ handler: gic_handle_irq Dec 16 12:23:47.456688 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Dec 16 12:23:47.456695 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Dec 16 12:23:47.456702 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Dec 16 12:23:47.456709 kernel: ITS [mem 0x08080000-0x0809ffff] Dec 16 12:23:47.456716 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1) Dec 16 12:23:47.456723 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1) Dec 16 12:23:47.456730 kernel: GICv3: using LPI property table @0x0000000100120000 Dec 16 12:23:47.456737 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000 Dec 16 12:23:47.456745 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Dec 16 12:23:47.456752 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 16 12:23:47.456759 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Dec 16 12:23:47.456766 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Dec 16 12:23:47.456772 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Dec 16 12:23:47.456779 kernel: Console: colour dummy device 80x25 Dec 16 12:23:47.456787 kernel: ACPI: Core revision 20240827 Dec 16 12:23:47.456795 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Dec 16 12:23:47.456802 kernel: pid_max: default: 32768 minimum: 301 Dec 16 12:23:47.456810 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Dec 16 12:23:47.456818 kernel: landlock: Up and running. Dec 16 12:23:47.456824 kernel: SELinux: Initializing. Dec 16 12:23:47.456831 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Dec 16 12:23:47.456839 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Dec 16 12:23:47.456846 kernel: rcu: Hierarchical SRCU implementation. Dec 16 12:23:47.456854 kernel: rcu: Max phase no-delay instances is 400. Dec 16 12:23:47.456861 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Dec 16 12:23:47.456870 kernel: Remapping and enabling EFI services. Dec 16 12:23:47.456877 kernel: smp: Bringing up secondary CPUs ... Dec 16 12:23:47.456884 kernel: Detected PIPT I-cache on CPU1 Dec 16 12:23:47.456892 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Dec 16 12:23:47.456899 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000 Dec 16 12:23:47.456906 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Dec 16 12:23:47.456914 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Dec 16 12:23:47.456922 kernel: smp: Brought up 1 node, 2 CPUs Dec 16 12:23:47.456930 kernel: SMP: Total of 2 processors activated. 
Dec 16 12:23:47.456942 kernel: CPU: All CPU(s) started at EL1 Dec 16 12:23:47.456950 kernel: CPU features: detected: 32-bit EL0 Support Dec 16 12:23:47.456958 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Dec 16 12:23:47.456966 kernel: CPU features: detected: Common not Private translations Dec 16 12:23:47.456974 kernel: CPU features: detected: CRC32 instructions Dec 16 12:23:47.456981 kernel: CPU features: detected: Enhanced Virtualization Traps Dec 16 12:23:47.456990 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Dec 16 12:23:47.456998 kernel: CPU features: detected: LSE atomic instructions Dec 16 12:23:47.457005 kernel: CPU features: detected: Privileged Access Never Dec 16 12:23:47.457013 kernel: CPU features: detected: RAS Extension Support Dec 16 12:23:47.457020 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Dec 16 12:23:47.457028 kernel: alternatives: applying system-wide alternatives Dec 16 12:23:47.457037 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 Dec 16 12:23:47.457045 kernel: Memory: 3885988K/4096000K available (11200K kernel code, 2456K rwdata, 9084K rodata, 12416K init, 1038K bss, 188532K reserved, 16384K cma-reserved) Dec 16 12:23:47.457053 kernel: devtmpfs: initialized Dec 16 12:23:47.457061 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Dec 16 12:23:47.457068 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Dec 16 12:23:47.457076 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Dec 16 12:23:47.457084 kernel: 0 pages in range for non-PLT usage Dec 16 12:23:47.457093 kernel: 515184 pages in range for PLT usage Dec 16 12:23:47.457100 kernel: pinctrl core: initialized pinctrl subsystem Dec 16 12:23:47.457108 kernel: SMBIOS 3.0.0 present. Dec 16 12:23:47.457115 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Dec 16 12:23:47.457123 kernel: DMI: Memory slots populated: 1/1 Dec 16 12:23:47.457130 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Dec 16 12:23:47.457138 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Dec 16 12:23:47.457147 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Dec 16 12:23:47.457155 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Dec 16 12:23:47.457163 kernel: audit: initializing netlink subsys (disabled) Dec 16 12:23:47.457170 kernel: audit: type=2000 audit(0.011:1): state=initialized audit_enabled=0 res=1 Dec 16 12:23:47.457178 kernel: thermal_sys: Registered thermal governor 'step_wise' Dec 16 12:23:47.457185 kernel: cpuidle: using governor menu Dec 16 12:23:47.457193 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Dec 16 12:23:47.457202 kernel: ASID allocator initialised with 32768 entries Dec 16 12:23:47.457210 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Dec 16 12:23:47.457217 kernel: Serial: AMBA PL011 UART driver Dec 16 12:23:47.457225 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Dec 16 12:23:47.457232 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Dec 16 12:23:47.457240 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Dec 16 12:23:47.457248 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Dec 16 12:23:47.457255 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Dec 16 12:23:47.457273 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Dec 16 12:23:47.457282 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Dec 16 12:23:47.458774 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Dec 16 12:23:47.458790 kernel: ACPI: Added _OSI(Module Device) Dec 16 12:23:47.458798 kernel: ACPI: Added _OSI(Processor Device) Dec 16 12:23:47.458806 kernel: ACPI: Added _OSI(Processor Aggregator Device) Dec 16 12:23:47.458814 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Dec 16 12:23:47.458828 kernel: ACPI: Interpreter enabled Dec 16 12:23:47.458835 kernel: ACPI: Using GIC for interrupt routing Dec 16 12:23:47.458843 kernel: ACPI: MCFG table detected, 1 entries Dec 16 12:23:47.458851 kernel: ACPI: CPU0 has been hot-added Dec 16 12:23:47.458858 kernel: ACPI: CPU1 has been hot-added Dec 16 12:23:47.458866 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Dec 16 12:23:47.458873 kernel: printk: legacy console [ttyAMA0] enabled Dec 16 12:23:47.458882 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Dec 16 12:23:47.459070 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Dec 16 12:23:47.459160 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Dec 16 12:23:47.459244 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Dec 16 12:23:47.460413 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Dec 16 12:23:47.460505 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Dec 16 12:23:47.460521 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Dec 16 12:23:47.460529 kernel: PCI host bridge to bus 0000:00 Dec 16 12:23:47.460616 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Dec 16 12:23:47.460690 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Dec 16 12:23:47.460761 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Dec 16 12:23:47.460831 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Dec 16 12:23:47.460928 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Dec 16 12:23:47.461020 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Dec 16 12:23:47.461105 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Dec 16 12:23:47.461185 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Dec 16 12:23:47.462345 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.462471 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Dec 16 12:23:47.462554 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 
12:23:47.463340 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:23:47.463435 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Dec 16 12:23:47.463527 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.463607 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Dec 16 12:23:47.463691 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:23:47.463771 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:23:47.463856 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.463935 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Dec 16 12:23:47.464016 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:23:47.464094 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:23:47.464175 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Dec 16 12:23:47.464260 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.465342 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Dec 16 12:23:47.465434 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 12:23:47.465518 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:23:47.465596 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Dec 16 12:23:47.465692 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.465772 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Dec 16 12:23:47.465851 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:23:47.465929 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:23:47.466006 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Dec 16 12:23:47.466091 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.466172 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Dec 16 12:23:47.466250 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:23:47.466386 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:23:47.466469 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Dec 16 12:23:47.466556 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.466639 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Dec 16 12:23:47.466719 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:23:47.466799 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:23:47.466878 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Dec 16 12:23:47.466965 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.467045 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Dec 16 12:23:47.467126 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:23:47.467204 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:23:47.467932 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Dec 16 12:23:47.468041 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Dec 16 12:23:47.468121 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:23:47.468206 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:23:47.468391 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 
0x070002 conventional PCI endpoint Dec 16 12:23:47.468487 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] Dec 16 12:23:47.468584 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 12:23:47.468667 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Dec 16 12:23:47.468760 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Dec 16 12:23:47.468846 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 12:23:47.468939 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Dec 16 12:23:47.469021 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Dec 16 12:23:47.469111 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Dec 16 12:23:47.469191 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Dec 16 12:23:47.469289 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Dec 16 12:23:47.469395 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Dec 16 12:23:47.469479 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Dec 16 12:23:47.469570 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Dec 16 12:23:47.469653 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] Dec 16 12:23:47.469733 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Dec 16 12:23:47.469824 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Dec 16 12:23:47.469909 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Dec 16 12:23:47.469992 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Dec 16 12:23:47.470084 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Dec 16 12:23:47.470168 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Dec 16 12:23:47.470249 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Dec 16 12:23:47.470431 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Dec 16 12:23:47.470520 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Dec 16 12:23:47.470602 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Dec 16 12:23:47.470683 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Dec 16 12:23:47.470769 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Dec 16 12:23:47.470853 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Dec 16 12:23:47.470931 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Dec 16 12:23:47.471013 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Dec 16 12:23:47.471092 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Dec 16 12:23:47.471170 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Dec 16 12:23:47.471250 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Dec 16 12:23:47.472459 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Dec 16 12:23:47.472553 kernel: pci 0000:00:02.3: bridge window [mem 
0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Dec 16 12:23:47.472637 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Dec 16 12:23:47.472717 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Dec 16 12:23:47.472795 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Dec 16 12:23:47.472884 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Dec 16 12:23:47.472976 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Dec 16 12:23:47.473055 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Dec 16 12:23:47.473138 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Dec 16 12:23:47.473218 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Dec 16 12:23:47.473352 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Dec 16 12:23:47.473452 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Dec 16 12:23:47.473532 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Dec 16 12:23:47.473613 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Dec 16 12:23:47.473695 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Dec 16 12:23:47.473776 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Dec 16 12:23:47.473855 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Dec 16 12:23:47.473938 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Dec 16 12:23:47.474018 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Dec 16 12:23:47.474099 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Dec 16 12:23:47.474179 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Dec 16 12:23:47.474259 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Dec 16 12:23:47.474370 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Dec 16 12:23:47.474455 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Dec 16 12:23:47.474534 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Dec 16 12:23:47.474614 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Dec 16 12:23:47.474692 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Dec 16 12:23:47.474771 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Dec 16 12:23:47.474849 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Dec 16 12:23:47.474930 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Dec 16 12:23:47.475008 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Dec 16 
12:23:47.475087 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Dec 16 12:23:47.475165 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Dec 16 12:23:47.475244 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Dec 16 12:23:47.476492 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Dec 16 12:23:47.476613 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Dec 16 12:23:47.476711 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Dec 16 12:23:47.476792 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Dec 16 12:23:47.476872 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Dec 16 12:23:47.476954 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Dec 16 12:23:47.477037 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Dec 16 12:23:47.477118 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Dec 16 12:23:47.477197 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Dec 16 12:23:47.477295 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Dec 16 12:23:47.477422 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Dec 16 12:23:47.477507 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Dec 16 12:23:47.477586 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Dec 16 12:23:47.477667 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Dec 16 12:23:47.477746 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Dec 16 12:23:47.477827 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Dec 16 12:23:47.477905 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Dec 16 12:23:47.477985 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Dec 16 12:23:47.478066 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Dec 16 12:23:47.478145 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Dec 16 12:23:47.478224 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Dec 16 12:23:47.478369 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Dec 16 12:23:47.478469 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Dec 16 12:23:47.478551 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Dec 16 12:23:47.478636 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Dec 16 12:23:47.478716 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Dec 16 12:23:47.478794 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Dec 16 12:23:47.478873 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Dec 16 12:23:47.478951 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:23:47.479035 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Dec 16 12:23:47.479116 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Dec 16 12:23:47.479194 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Dec 16 12:23:47.480401 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Dec 16 12:23:47.480541 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:23:47.480636 kernel: pci 0000:03:00.0: BAR 4 [mem 
0x8000400000-0x8000403fff 64bit pref]: assigned Dec 16 12:23:47.480719 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Dec 16 12:23:47.480807 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Dec 16 12:23:47.480887 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Dec 16 12:23:47.480986 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Dec 16 12:23:47.481068 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:23:47.481154 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Dec 16 12:23:47.481235 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Dec 16 12:23:47.481380 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Dec 16 12:23:47.481476 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Dec 16 12:23:47.481556 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:23:47.481642 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Dec 16 12:23:47.481723 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Dec 16 12:23:47.481808 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Dec 16 12:23:47.481886 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Dec 16 12:23:47.481964 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Dec 16 12:23:47.482043 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:23:47.482129 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Dec 16 12:23:47.482210 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Dec 16 12:23:47.483383 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Dec 16 12:23:47.483505 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Dec 16 12:23:47.483588 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Dec 16 12:23:47.483668 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:23:47.483755 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Dec 16 12:23:47.483838 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Dec 16 12:23:47.483920 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Dec 16 12:23:47.484002 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Dec 16 12:23:47.484084 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Dec 16 12:23:47.484164 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Dec 16 12:23:47.484246 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:23:47.484412 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Dec 16 12:23:47.484499 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Dec 16 12:23:47.484580 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Dec 16 12:23:47.485497 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:23:47.485602 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Dec 16 12:23:47.485685 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Dec 16 12:23:47.485766 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Dec 16 12:23:47.485862 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:23:47.485944 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Dec 16 12:23:47.486019 kernel: pci_bus 0000:00: 
resource 5 [io 0x0000-0xffff window] Dec 16 12:23:47.486096 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Dec 16 12:23:47.486183 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Dec 16 12:23:47.486261 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Dec 16 12:23:47.487207 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Dec 16 12:23:47.487356 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Dec 16 12:23:47.489380 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Dec 16 12:23:47.489474 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Dec 16 12:23:47.489563 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Dec 16 12:23:47.489638 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Dec 16 12:23:47.489715 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Dec 16 12:23:47.489797 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Dec 16 12:23:47.489877 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Dec 16 12:23:47.489950 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Dec 16 12:23:47.490032 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Dec 16 12:23:47.490106 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Dec 16 12:23:47.490180 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Dec 16 12:23:47.490261 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Dec 16 12:23:47.490469 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Dec 16 12:23:47.490545 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Dec 16 12:23:47.490632 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Dec 16 12:23:47.490708 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Dec 16 12:23:47.490782 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Dec 16 12:23:47.490867 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Dec 16 12:23:47.490941 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Dec 16 12:23:47.491015 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Dec 16 12:23:47.491095 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Dec 16 12:23:47.491170 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Dec 16 12:23:47.491243 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Dec 16 12:23:47.491255 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Dec 16 12:23:47.491264 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Dec 16 12:23:47.491295 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Dec 16 12:23:47.491314 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Dec 16 12:23:47.491323 kernel: iommu: Default domain type: Translated Dec 16 12:23:47.491331 kernel: iommu: DMA domain TLB invalidation policy: strict mode Dec 16 12:23:47.491339 kernel: efivars: Registered efivars operations Dec 16 12:23:47.491349 kernel: vgaarb: loaded Dec 16 12:23:47.491358 kernel: clocksource: Switched to clocksource arch_sys_counter Dec 16 12:23:47.491366 kernel: VFS: Disk quotas dquot_6.6.0 Dec 16 12:23:47.491375 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Dec 16 12:23:47.491383 kernel: pnp: PnP ACPI init Dec 16 12:23:47.491496 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Dec 16 
12:23:47.491509 kernel: pnp: PnP ACPI: found 1 devices Dec 16 12:23:47.491518 kernel: NET: Registered PF_INET protocol family Dec 16 12:23:47.491526 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Dec 16 12:23:47.491535 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Dec 16 12:23:47.491544 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Dec 16 12:23:47.491552 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Dec 16 12:23:47.491560 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Dec 16 12:23:47.491569 kernel: TCP: Hash tables configured (established 32768 bind 32768) Dec 16 12:23:47.491577 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 16 12:23:47.491586 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Dec 16 12:23:47.491594 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Dec 16 12:23:47.491686 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Dec 16 12:23:47.491697 kernel: PCI: CLS 0 bytes, default 64 Dec 16 12:23:47.491706 kernel: kvm [1]: HYP mode not available Dec 16 12:23:47.491715 kernel: Initialise system trusted keyrings Dec 16 12:23:47.491723 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Dec 16 12:23:47.491731 kernel: Key type asymmetric registered Dec 16 12:23:47.491739 kernel: Asymmetric key parser 'x509' registered Dec 16 12:23:47.491747 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Dec 16 12:23:47.491755 kernel: io scheduler mq-deadline registered Dec 16 12:23:47.491763 kernel: io scheduler kyber registered Dec 16 12:23:47.491771 kernel: io scheduler bfq registered Dec 16 12:23:47.491781 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Dec 16 12:23:47.491864 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Dec 16 12:23:47.491947 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Dec 16 12:23:47.492027 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.492566 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Dec 16 12:23:47.492664 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Dec 16 12:23:47.492750 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.492834 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Dec 16 12:23:47.492950 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Dec 16 12:23:47.493424 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.493536 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Dec 16 12:23:47.493620 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Dec 16 12:23:47.493706 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.493790 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Dec 16 12:23:47.493870 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Dec 16 12:23:47.493948 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.494029 kernel: pcieport 
0000:00:02.5: PME: Signaling with IRQ 55 Dec 16 12:23:47.494108 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Dec 16 12:23:47.494187 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.494317 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Dec 16 12:23:47.497465 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Dec 16 12:23:47.497563 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.497650 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Dec 16 12:23:47.497734 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Dec 16 12:23:47.497815 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.497832 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Dec 16 12:23:47.497915 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Dec 16 12:23:47.497997 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Dec 16 12:23:47.498089 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Dec 16 12:23:47.498100 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Dec 16 12:23:47.498109 kernel: ACPI: button: Power Button [PWRB] Dec 16 12:23:47.498117 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Dec 16 12:23:47.498206 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Dec 16 12:23:47.498331 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Dec 16 12:23:47.498345 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Dec 16 12:23:47.498354 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Dec 16 12:23:47.498441 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Dec 16 12:23:47.498452 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Dec 16 12:23:47.498461 kernel: thunder_xcv, ver 1.0 Dec 16 12:23:47.498472 kernel: thunder_bgx, ver 1.0 Dec 16 12:23:47.498480 kernel: nicpf, ver 1.0 Dec 16 12:23:47.498488 kernel: nicvf, ver 1.0 Dec 16 12:23:47.498583 kernel: rtc-efi rtc-efi.0: registered as rtc0 Dec 16 12:23:47.498660 kernel: rtc-efi rtc-efi.0: setting system clock to 2025-12-16T12:23:46 UTC (1765887826) Dec 16 12:23:47.498671 kernel: hid: raw HID events driver (C) Jiri Kosina Dec 16 12:23:47.498681 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Dec 16 12:23:47.498690 kernel: watchdog: NMI not fully supported Dec 16 12:23:47.498698 kernel: watchdog: Hard watchdog permanently disabled Dec 16 12:23:47.498706 kernel: NET: Registered PF_INET6 protocol family Dec 16 12:23:47.498714 kernel: Segment Routing with IPv6 Dec 16 12:23:47.498722 kernel: In-situ OAM (IOAM) with IPv6 Dec 16 12:23:47.498730 kernel: NET: Registered PF_PACKET protocol family Dec 16 12:23:47.498739 kernel: Key type dns_resolver registered Dec 16 12:23:47.498747 kernel: registered taskstats version 1 Dec 16 12:23:47.498756 kernel: Loading compiled-in X.509 certificates Dec 16 12:23:47.498764 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.61-flatcar: a5d527f63342895c4af575176d4ae6e640b6d0e9' Dec 16 12:23:47.498772 kernel: Demotion targets for Node 0: null Dec 16 12:23:47.498780 kernel: Key type .fscrypt 
registered Dec 16 12:23:47.498788 kernel: Key type fscrypt-provisioning registered Dec 16 12:23:47.498797 kernel: ima: No TPM chip found, activating TPM-bypass! Dec 16 12:23:47.498805 kernel: ima: Allocated hash algorithm: sha1 Dec 16 12:23:47.498813 kernel: ima: No architecture policies found Dec 16 12:23:47.498822 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Dec 16 12:23:47.498830 kernel: clk: Disabling unused clocks Dec 16 12:23:47.498838 kernel: PM: genpd: Disabling unused power domains Dec 16 12:23:47.498846 kernel: Freeing unused kernel memory: 12416K Dec 16 12:23:47.498853 kernel: Run /init as init process Dec 16 12:23:47.498863 kernel: with arguments: Dec 16 12:23:47.498871 kernel: /init Dec 16 12:23:47.498878 kernel: with environment: Dec 16 12:23:47.498886 kernel: HOME=/ Dec 16 12:23:47.498894 kernel: TERM=linux Dec 16 12:23:47.498902 kernel: ACPI: bus type USB registered Dec 16 12:23:47.498911 kernel: usbcore: registered new interface driver usbfs Dec 16 12:23:47.498920 kernel: usbcore: registered new interface driver hub Dec 16 12:23:47.498928 kernel: usbcore: registered new device driver usb Dec 16 12:23:47.499014 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:23:47.499098 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Dec 16 12:23:47.499180 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Dec 16 12:23:47.499264 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Dec 16 12:23:47.500226 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Dec 16 12:23:47.500470 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Dec 16 12:23:47.500593 kernel: hub 1-0:1.0: USB hub found Dec 16 12:23:47.502191 kernel: hub 1-0:1.0: 4 ports detected Dec 16 12:23:47.502373 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Dec 16 12:23:47.502484 kernel: hub 2-0:1.0: USB hub found Dec 16 12:23:47.502580 kernel: hub 2-0:1.0: 4 ports detected Dec 16 12:23:47.502591 kernel: SCSI subsystem initialized Dec 16 12:23:47.502688 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Dec 16 12:23:47.502783 kernel: scsi host0: Virtio SCSI HBA Dec 16 12:23:47.502884 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Dec 16 12:23:47.502986 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Dec 16 12:23:47.503075 kernel: sd 0:0:0:1: Power-on or device reset occurred Dec 16 12:23:47.503165 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Dec 16 12:23:47.503252 kernel: sd 0:0:0:1: [sda] Write Protect is off Dec 16 12:23:47.505456 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Dec 16 12:23:47.505566 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Dec 16 12:23:47.505577 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Dec 16 12:23:47.505586 kernel: GPT:25804799 != 80003071 Dec 16 12:23:47.505594 kernel: GPT:Alternate GPT header not at the end of the disk. Dec 16 12:23:47.505602 kernel: GPT:25804799 != 80003071 Dec 16 12:23:47.505610 kernel: GPT: Use GNU Parted to correct GPT errors. 
Dec 16 12:23:47.505618 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Dec 16 12:23:47.505707 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Dec 16 12:23:47.505798 kernel: sr 0:0:0:0: Power-on or device reset occurred Dec 16 12:23:47.505885 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Dec 16 12:23:47.505895 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Dec 16 12:23:47.505979 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Dec 16 12:23:47.505990 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Dec 16 12:23:47.505999 kernel: device-mapper: uevent: version 1.0.3 Dec 16 12:23:47.506007 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Dec 16 12:23:47.506016 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Dec 16 12:23:47.506024 kernel: raid6: neonx8 gen() 15739 MB/s Dec 16 12:23:47.506032 kernel: raid6: neonx4 gen() 15536 MB/s Dec 16 12:23:47.506040 kernel: raid6: neonx2 gen() 9278 MB/s Dec 16 12:23:47.506047 kernel: raid6: neonx1 gen() 10437 MB/s Dec 16 12:23:47.506055 kernel: raid6: int64x8 gen() 6697 MB/s Dec 16 12:23:47.506065 kernel: raid6: int64x4 gen() 7308 MB/s Dec 16 12:23:47.506073 kernel: raid6: int64x2 gen() 6067 MB/s Dec 16 12:23:47.506178 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Dec 16 12:23:47.506191 kernel: raid6: int64x1 gen() 4990 MB/s Dec 16 12:23:47.506199 kernel: raid6: using algorithm neonx8 gen() 15739 MB/s Dec 16 12:23:47.506207 kernel: raid6: .... xor() 11972 MB/s, rmw enabled Dec 16 12:23:47.506217 kernel: raid6: using neon recovery algorithm Dec 16 12:23:47.506225 kernel: xor: measuring software checksum speed Dec 16 12:23:47.506232 kernel: 8regs : 16203 MB/sec Dec 16 12:23:47.506240 kernel: 32regs : 21704 MB/sec Dec 16 12:23:47.506248 kernel: arm64_neon : 28157 MB/sec Dec 16 12:23:47.506256 kernel: xor: using function: arm64_neon (28157 MB/sec) Dec 16 12:23:47.506264 kernel: Btrfs loaded, zoned=no, fsverity=no Dec 16 12:23:47.506286 kernel: BTRFS: device fsid d09b8b5a-fb5f-4a17-94ef-0a452535b2bc devid 1 transid 37 /dev/mapper/usr (254:0) scanned by mount (212) Dec 16 12:23:47.506296 kernel: BTRFS info (device dm-0): first mount of filesystem d09b8b5a-fb5f-4a17-94ef-0a452535b2bc Dec 16 12:23:47.506314 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:23:47.506323 kernel: BTRFS info (device dm-0): enabling ssd optimizations Dec 16 12:23:47.506331 kernel: BTRFS info (device dm-0): disabling log replay at mount time Dec 16 12:23:47.506339 kernel: BTRFS info (device dm-0): enabling free space tree Dec 16 12:23:47.506347 kernel: loop: module loaded Dec 16 12:23:47.506357 kernel: loop0: detected capacity change from 0 to 91480 Dec 16 12:23:47.506365 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Dec 16 12:23:47.506472 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Dec 16 12:23:47.506486 systemd[1]: Successfully made /usr/ read-only. Dec 16 12:23:47.506497 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:23:47.506506 systemd[1]: Detected virtualization kvm. 
Dec 16 12:23:47.506516 systemd[1]: Detected architecture arm64. Dec 16 12:23:47.506524 systemd[1]: Running in initrd. Dec 16 12:23:47.506533 systemd[1]: No hostname configured, using default hostname. Dec 16 12:23:47.506542 systemd[1]: Hostname set to . Dec 16 12:23:47.506550 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:23:47.506558 systemd[1]: Queued start job for default target initrd.target. Dec 16 12:23:47.506570 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:23:47.506579 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:23:47.506587 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:23:47.506596 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Dec 16 12:23:47.506605 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:23:47.506614 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Dec 16 12:23:47.506625 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Dec 16 12:23:47.506634 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:23:47.506642 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:23:47.506651 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:23:47.506659 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:23:47.506667 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:23:47.506677 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:23:47.506686 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:23:47.506694 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:23:47.506703 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:23:47.506711 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:23:47.506720 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Dec 16 12:23:47.506728 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Dec 16 12:23:47.506738 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:23:47.506747 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Dec 16 12:23:47.506755 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:23:47.506763 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:23:47.506772 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Dec 16 12:23:47.506780 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Dec 16 12:23:47.506789 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:23:47.506798 systemd[1]: Finished network-cleanup.service - Network Cleanup. Dec 16 12:23:47.506807 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Dec 16 12:23:47.506816 systemd[1]: Starting systemd-fsck-usr.service... 
Dec 16 12:23:47.506824 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:23:47.506832 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:23:47.506842 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:23:47.506851 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Dec 16 12:23:47.506860 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:23:47.506868 systemd[1]: Finished systemd-fsck-usr.service. Dec 16 12:23:47.506877 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Dec 16 12:23:47.506907 systemd-journald[349]: Collecting audit messages is enabled. Dec 16 12:23:47.506928 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Dec 16 12:23:47.506937 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Dec 16 12:23:47.506948 kernel: audit: type=1130 audit(1765887827.459:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.506956 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:23:47.506966 kernel: audit: type=1130 audit(1765887827.463:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.506974 kernel: Bridge firewalling registered Dec 16 12:23:47.506982 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Dec 16 12:23:47.506991 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:23:47.506999 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:23:47.507010 kernel: audit: type=1130 audit(1765887827.476:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.507018 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:23:47.507027 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:23:47.507035 kernel: audit: type=1130 audit(1765887827.503:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.507045 systemd-journald[349]: Journal started Dec 16 12:23:47.507066 systemd-journald[349]: Runtime Journal (/run/log/journal/c8ccae30d5f7424ea73c7f339d676045) is 8M, max 76.5M, 68.5M free. Dec 16 12:23:47.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:23:47.476000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.503000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.467316 systemd-modules-load[351]: Inserted module 'br_netfilter' Dec 16 12:23:47.511687 systemd[1]: Started systemd-journald.service - Journal Service. Dec 16 12:23:47.511716 kernel: audit: type=1130 audit(1765887827.508:6): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.508000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.511298 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:23:47.511000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.514592 kernel: audit: type=1130 audit(1765887827.511:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.515290 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:23:47.517419 kernel: audit: type=1334 audit(1765887827.512:8): prog-id=6 op=LOAD Dec 16 12:23:47.512000 audit: BPF prog-id=6 op=LOAD Dec 16 12:23:47.518098 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Dec 16 12:23:47.528518 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:23:47.533692 kernel: audit: type=1130 audit(1765887827.529:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.529000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.535409 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Dec 16 12:23:47.543986 systemd-tmpfiles[377]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Dec 16 12:23:47.550788 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:23:47.551000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.557334 kernel: audit: type=1130 audit(1765887827.551:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:23:47.568287 dracut-cmdline[391]: Using kernel command line parameters: rd.driver.pre=btrfs SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=f511955c7ec069359d088640c1194932d6d915b5bb2829e8afbb591f10cd0849 Dec 16 12:23:47.587120 systemd-resolved[376]: Positive Trust Anchors: Dec 16 12:23:47.588332 systemd-resolved[376]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:23:47.588338 systemd-resolved[376]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:23:47.588379 systemd-resolved[376]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:23:47.615485 systemd-resolved[376]: Defaulting to hostname 'linux'. Dec 16 12:23:47.616669 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:23:47.617000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.617447 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:23:47.690322 kernel: Loading iSCSI transport class v2.0-870. Dec 16 12:23:47.700442 kernel: iscsi: registered transport (tcp) Dec 16 12:23:47.714574 kernel: iscsi: registered transport (qla4xxx) Dec 16 12:23:47.714701 kernel: QLogic iSCSI HBA Driver Dec 16 12:23:47.744082 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:23:47.769422 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:23:47.771000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.772832 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:23:47.826155 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Dec 16 12:23:47.826000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.829092 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Dec 16 12:23:47.830610 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Dec 16 12:23:47.882000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:23:47.883000 audit: BPF prog-id=7 op=LOAD Dec 16 12:23:47.883000 audit: BPF prog-id=8 op=LOAD Dec 16 12:23:47.882387 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:23:47.886163 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:23:47.919632 systemd-udevd[629]: Using default interface naming scheme 'v257'. Dec 16 12:23:47.930963 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:23:47.931000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.934198 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Dec 16 12:23:47.970413 dracut-pre-trigger[694]: rd.md=0: removing MD RAID activation Dec 16 12:23:47.972312 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:23:47.972000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:47.973000 audit: BPF prog-id=9 op=LOAD Dec 16 12:23:47.974840 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:23:48.012043 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:23:48.012000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:48.014126 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Dec 16 12:23:48.031032 systemd-networkd[746]: lo: Link UP Dec 16 12:23:48.031733 systemd-networkd[746]: lo: Gained carrier Dec 16 12:23:48.032284 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:23:48.033000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:48.035413 systemd[1]: Reached target network.target - Network. Dec 16 12:23:48.086823 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:23:48.087000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:48.090873 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Dec 16 12:23:48.214340 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Dec 16 12:23:48.214399 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Dec 16 12:23:48.222199 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Dec 16 12:23:48.267027 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. 
Dec 16 12:23:48.279325 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Dec 16 12:23:48.286475 kernel: usbcore: registered new interface driver usbhid Dec 16 12:23:48.286537 kernel: usbhid: USB HID core driver Dec 16 12:23:48.298625 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Dec 16 12:23:48.309925 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Dec 16 12:23:48.314060 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Dec 16 12:23:48.324046 systemd-networkd[746]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:48.326357 systemd-networkd[746]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:23:48.330218 systemd-networkd[746]: eth0: Link UP Dec 16 12:23:48.330524 systemd-networkd[746]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:48.330528 systemd-networkd[746]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:23:48.331531 systemd-networkd[746]: eth0: Gained carrier Dec 16 12:23:48.331542 systemd-networkd[746]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:48.339791 systemd-networkd[746]: eth1: Link UP Dec 16 12:23:48.341401 systemd-networkd[746]: eth1: Gained carrier Dec 16 12:23:48.341417 systemd-networkd[746]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:48.344218 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 12:23:48.349466 disk-uuid[816]: Primary Header is updated. Dec 16 12:23:48.349466 disk-uuid[816]: Secondary Entries is updated. Dec 16 12:23:48.349466 disk-uuid[816]: Secondary Header is updated. Dec 16 12:23:48.353000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:48.349517 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:23:48.349762 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:23:48.353984 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:23:48.360565 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:23:48.392410 systemd-networkd[746]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 12:23:48.398466 systemd-networkd[746]: eth0: DHCPv4 address 128.140.49.38/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 12:23:48.404453 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:23:48.404000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:48.488321 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. 
Dec 16 12:23:48.489000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:48.489944 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:23:48.491995 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:23:48.494355 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:23:48.497958 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Dec 16 12:23:48.524234 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:23:48.524000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.412532 disk-uuid[817]: Warning: The kernel is still using the old partition table. Dec 16 12:23:49.412532 disk-uuid[817]: The new table will be used at the next reboot or after you Dec 16 12:23:49.412532 disk-uuid[817]: run partprobe(8) or kpartx(8) Dec 16 12:23:49.412532 disk-uuid[817]: The operation has completed successfully. Dec 16 12:23:49.422966 systemd[1]: disk-uuid.service: Deactivated successfully. Dec 16 12:23:49.424000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.424000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.423125 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Dec 16 12:23:49.425739 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Dec 16 12:23:49.473306 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (849) Dec 16 12:23:49.474684 kernel: BTRFS info (device sda6): first mount of filesystem 006ba4f4-0786-4a38-abb9-900c84a8b97a Dec 16 12:23:49.474727 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:23:49.478370 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:23:49.478423 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:23:49.478443 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:23:49.485298 kernel: BTRFS info (device sda6): last unmount of filesystem 006ba4f4-0786-4a38-abb9-900c84a8b97a Dec 16 12:23:49.485904 systemd[1]: Finished ignition-setup.service - Ignition (setup). Dec 16 12:23:49.486000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.488617 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... 
Dec 16 12:23:49.615487 ignition[868]: Ignition 2.22.0 Dec 16 12:23:49.615505 ignition[868]: Stage: fetch-offline Dec 16 12:23:49.615558 ignition[868]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:49.615567 ignition[868]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:49.615723 ignition[868]: parsed url from cmdline: "" Dec 16 12:23:49.615726 ignition[868]: no config URL provided Dec 16 12:23:49.615730 ignition[868]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:23:49.615738 ignition[868]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:23:49.622000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.620525 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:23:49.615743 ignition[868]: failed to fetch config: resource requires networking Dec 16 12:23:49.623622 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Dec 16 12:23:49.615991 ignition[868]: Ignition finished successfully Dec 16 12:23:49.665155 ignition[875]: Ignition 2.22.0 Dec 16 12:23:49.665170 ignition[875]: Stage: fetch Dec 16 12:23:49.665374 ignition[875]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:49.665384 ignition[875]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:49.665464 ignition[875]: parsed url from cmdline: "" Dec 16 12:23:49.665467 ignition[875]: no config URL provided Dec 16 12:23:49.665472 ignition[875]: reading system config file "/usr/lib/ignition/user.ign" Dec 16 12:23:49.665478 ignition[875]: no config at "/usr/lib/ignition/user.ign" Dec 16 12:23:49.665508 ignition[875]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Dec 16 12:23:49.671147 ignition[875]: GET result: OK Dec 16 12:23:49.671287 ignition[875]: parsing config with SHA512: 0a9f31ad2273191ea16ec6e59d50bb94f5517c143323352749e5494c3b1d22e7c2529054b834dff5d21eef5e7e28d676d15f468523204dc383a4aeef043ee4b1 Dec 16 12:23:49.676142 unknown[875]: fetched base config from "system" Dec 16 12:23:49.676151 unknown[875]: fetched base config from "system" Dec 16 12:23:49.676619 ignition[875]: fetch: fetch complete Dec 16 12:23:49.676159 unknown[875]: fetched user config from "hetzner" Dec 16 12:23:49.676625 ignition[875]: fetch: fetch passed Dec 16 12:23:49.676675 ignition[875]: Ignition finished successfully Dec 16 12:23:49.680791 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Dec 16 12:23:49.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.684367 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Dec 16 12:23:49.721637 ignition[882]: Ignition 2.22.0 Dec 16 12:23:49.722235 ignition[882]: Stage: kargs Dec 16 12:23:49.722480 ignition[882]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:49.722489 ignition[882]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:49.724982 ignition[882]: kargs: kargs passed Dec 16 12:23:49.725483 ignition[882]: Ignition finished successfully Dec 16 12:23:49.727874 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). 
Dec 16 12:23:49.728000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.730166 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Dec 16 12:23:49.761121 ignition[889]: Ignition 2.22.0 Dec 16 12:23:49.761135 ignition[889]: Stage: disks Dec 16 12:23:49.762687 ignition[889]: no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:49.762703 ignition[889]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:49.765311 ignition[889]: disks: disks passed Dec 16 12:23:49.765372 ignition[889]: Ignition finished successfully Dec 16 12:23:49.770000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.769351 systemd[1]: Finished ignition-disks.service - Ignition (disks). Dec 16 12:23:49.770991 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Dec 16 12:23:49.772497 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Dec 16 12:23:49.774416 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:23:49.775544 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:23:49.776648 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:23:49.778723 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Dec 16 12:23:49.833306 systemd-fsck[898]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Dec 16 12:23:49.838569 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Dec 16 12:23:49.842000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:49.844193 systemd[1]: Mounting sysroot.mount - /sysroot... Dec 16 12:23:49.870539 systemd-networkd[746]: eth1: Gained IPv6LL Dec 16 12:23:49.930354 kernel: EXT4-fs (sda9): mounted filesystem fa93fc03-2e23-46f9-9013-1e396e3304a8 r/w with ordered data mode. Quota mode: none. Dec 16 12:23:49.931352 systemd[1]: Mounted sysroot.mount - /sysroot. Dec 16 12:23:49.932548 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Dec 16 12:23:49.936045 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:23:49.938130 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Dec 16 12:23:49.941321 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Dec 16 12:23:49.941938 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Dec 16 12:23:49.941973 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:23:49.960827 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Dec 16 12:23:49.962549 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... 
Dec 16 12:23:49.974398 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (906) Dec 16 12:23:49.979028 kernel: BTRFS info (device sda6): first mount of filesystem 006ba4f4-0786-4a38-abb9-900c84a8b97a Dec 16 12:23:49.979096 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:23:49.996625 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:23:49.996696 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:23:49.997552 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:23:50.003059 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:23:50.021523 coreos-metadata[908]: Dec 16 12:23:50.021 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Dec 16 12:23:50.026438 coreos-metadata[908]: Dec 16 12:23:50.025 INFO Fetch successful Dec 16 12:23:50.027542 coreos-metadata[908]: Dec 16 12:23:50.027 INFO wrote hostname ci-4515-1-0-6-95bdd2e3e7 to /sysroot/etc/hostname Dec 16 12:23:50.032163 initrd-setup-root[934]: cut: /sysroot/etc/passwd: No such file or directory Dec 16 12:23:50.033673 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Dec 16 12:23:50.035000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:50.039433 initrd-setup-root[942]: cut: /sysroot/etc/group: No such file or directory Dec 16 12:23:50.045641 initrd-setup-root[949]: cut: /sysroot/etc/shadow: No such file or directory Dec 16 12:23:50.050160 initrd-setup-root[956]: cut: /sysroot/etc/gshadow: No such file or directory Dec 16 12:23:50.160397 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Dec 16 12:23:50.161000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:50.164795 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Dec 16 12:23:50.167445 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Dec 16 12:23:50.185301 kernel: BTRFS info (device sda6): last unmount of filesystem 006ba4f4-0786-4a38-abb9-900c84a8b97a Dec 16 12:23:50.209622 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Dec 16 12:23:50.210000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:50.219802 ignition[1024]: INFO : Ignition 2.22.0 Dec 16 12:23:50.219802 ignition[1024]: INFO : Stage: mount Dec 16 12:23:50.221671 ignition[1024]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:50.221671 ignition[1024]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:50.221671 ignition[1024]: INFO : mount: mount passed Dec 16 12:23:50.221671 ignition[1024]: INFO : Ignition finished successfully Dec 16 12:23:50.228248 systemd[1]: Finished ignition-mount.service - Ignition (mount). Dec 16 12:23:50.229000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:50.232776 systemd[1]: Starting ignition-files.service - Ignition (files)... 
Dec 16 12:23:50.254471 systemd-networkd[746]: eth0: Gained IPv6LL Dec 16 12:23:50.462453 systemd[1]: sysroot-oem.mount: Deactivated successfully. Dec 16 12:23:50.464770 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Dec 16 12:23:50.497321 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1035) Dec 16 12:23:50.500944 kernel: BTRFS info (device sda6): first mount of filesystem 006ba4f4-0786-4a38-abb9-900c84a8b97a Dec 16 12:23:50.501311 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Dec 16 12:23:50.505574 kernel: BTRFS info (device sda6): enabling ssd optimizations Dec 16 12:23:50.505648 kernel: BTRFS info (device sda6): turning on async discard Dec 16 12:23:50.505668 kernel: BTRFS info (device sda6): enabling free space tree Dec 16 12:23:50.509119 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. Dec 16 12:23:50.544283 ignition[1052]: INFO : Ignition 2.22.0 Dec 16 12:23:50.544283 ignition[1052]: INFO : Stage: files Dec 16 12:23:50.546159 ignition[1052]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:50.546159 ignition[1052]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:50.546159 ignition[1052]: DEBUG : files: compiled without relabeling support, skipping Dec 16 12:23:50.549409 ignition[1052]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Dec 16 12:23:50.549409 ignition[1052]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Dec 16 12:23:50.555567 ignition[1052]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Dec 16 12:23:50.557649 ignition[1052]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Dec 16 12:23:50.559791 ignition[1052]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Dec 16 12:23:50.558678 unknown[1052]: wrote ssh authorized keys file for user: core Dec 16 12:23:50.562584 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:23:50.562584 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Dec 16 12:23:50.822445 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Dec 16 12:23:50.928859 ignition[1052]: INFO : 
files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:23:50.928859 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Dec 16 12:23:50.940273 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:23:50.940273 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Dec 16 12:23:50.940273 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:23:50.940273 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(9): [finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:23:50.940273 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:23:50.940273 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Dec 16 12:23:51.086503 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Dec 16 12:23:51.630667 ignition[1052]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Dec 16 12:23:51.630667 ignition[1052]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Dec 16 12:23:51.634340 ignition[1052]: INFO : files: files passed Dec 16 12:23:51.634340 ignition[1052]: INFO : 
Ignition finished successfully Dec 16 12:23:51.653051 kernel: kauditd_printk_skb: 28 callbacks suppressed Dec 16 12:23:51.653083 kernel: audit: type=1130 audit(1765887831.637:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.637000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.636805 systemd[1]: Finished ignition-files.service - Ignition (files). Dec 16 12:23:51.639067 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Dec 16 12:23:51.652936 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... Dec 16 12:23:51.658741 systemd[1]: ignition-quench.service: Deactivated successfully. Dec 16 12:23:51.659752 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Dec 16 12:23:51.660000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.660000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.664358 kernel: audit: type=1130 audit(1765887831.660:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.664402 kernel: audit: type=1131 audit(1765887831.660:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.676040 initrd-setup-root-after-ignition[1083]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:23:51.676040 initrd-setup-root-after-ignition[1083]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:23:51.678752 initrd-setup-root-after-ignition[1087]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Dec 16 12:23:51.680684 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:23:51.681586 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Dec 16 12:23:51.681000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.684921 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Dec 16 12:23:51.687053 kernel: audit: type=1130 audit(1765887831.681:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.748160 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Dec 16 12:23:51.749496 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. 
Dec 16 12:23:51.750000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.750000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.751796 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. Dec 16 12:23:51.757780 kernel: audit: type=1130 audit(1765887831.750:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.757812 kernel: audit: type=1131 audit(1765887831.750:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.756850 systemd[1]: Reached target initrd.target - Initrd Default Target. Dec 16 12:23:51.758459 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Dec 16 12:23:51.759645 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Dec 16 12:23:51.786525 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:23:51.787000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.792172 kernel: audit: type=1130 audit(1765887831.787:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.791418 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Dec 16 12:23:51.818154 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Dec 16 12:23:51.819136 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:23:51.820740 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:23:51.821542 systemd[1]: Stopped target timers.target - Timer Units. Dec 16 12:23:51.823355 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Dec 16 12:23:51.824000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.823487 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Dec 16 12:23:51.827509 systemd[1]: Stopped target initrd.target - Initrd Default Target. Dec 16 12:23:51.829397 kernel: audit: type=1131 audit(1765887831.824:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.828868 systemd[1]: Stopped target basic.target - Basic System. Dec 16 12:23:51.830585 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Dec 16 12:23:51.832044 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Dec 16 12:23:51.832867 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. 
Dec 16 12:23:51.834447 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Dec 16 12:23:51.835951 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Dec 16 12:23:51.837185 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Dec 16 12:23:51.838673 systemd[1]: Stopped target sysinit.target - System Initialization. Dec 16 12:23:51.839905 systemd[1]: Stopped target local-fs.target - Local File Systems. Dec 16 12:23:51.840940 systemd[1]: Stopped target swap.target - Swaps. Dec 16 12:23:51.841846 systemd[1]: dracut-pre-mount.service: Deactivated successfully. Dec 16 12:23:51.841982 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Dec 16 12:23:51.842000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.843446 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:23:51.846584 kernel: audit: type=1131 audit(1765887831.842:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.846071 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:23:51.847194 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Dec 16 12:23:51.847292 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:23:51.848596 systemd[1]: dracut-initqueue.service: Deactivated successfully. Dec 16 12:23:51.849000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.848717 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Dec 16 12:23:51.853767 kernel: audit: type=1131 audit(1765887831.849:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.852000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.850509 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Dec 16 12:23:51.854000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.850632 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Dec 16 12:23:51.855000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.853349 systemd[1]: ignition-files.service: Deactivated successfully. Dec 16 12:23:51.853458 systemd[1]: Stopped ignition-files.service - Ignition (files). Dec 16 12:23:51.854424 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Dec 16 12:23:51.854521 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. 
Dec 16 12:23:51.856637 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Dec 16 12:23:51.859134 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Dec 16 12:23:51.859304 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:23:51.862000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.865483 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Dec 16 12:23:51.867017 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. Dec 16 12:23:51.867234 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:23:51.870887 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Dec 16 12:23:51.870000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.871749 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:23:51.872000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.873354 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Dec 16 12:23:51.874094 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Dec 16 12:23:51.875000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.885606 systemd[1]: initrd-cleanup.service: Deactivated successfully. Dec 16 12:23:51.886455 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Dec 16 12:23:51.888000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.888000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.894180 systemd[1]: sysroot-boot.mount: Deactivated successfully. Dec 16 12:23:51.900149 ignition[1107]: INFO : Ignition 2.22.0 Dec 16 12:23:51.900149 ignition[1107]: INFO : Stage: umount Dec 16 12:23:51.902930 ignition[1107]: INFO : no configs at "/usr/lib/ignition/base.d" Dec 16 12:23:51.902930 ignition[1107]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Dec 16 12:23:51.902930 ignition[1107]: INFO : umount: umount passed Dec 16 12:23:51.902930 ignition[1107]: INFO : Ignition finished successfully Dec 16 12:23:51.904000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.907000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.904412 systemd[1]: ignition-mount.service: Deactivated successfully. 
Dec 16 12:23:51.913000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.904509 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Dec 16 12:23:51.915000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.905588 systemd[1]: ignition-disks.service: Deactivated successfully. Dec 16 12:23:51.905689 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Dec 16 12:23:51.918000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.907479 systemd[1]: ignition-kargs.service: Deactivated successfully. Dec 16 12:23:51.907529 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Dec 16 12:23:51.914373 systemd[1]: ignition-fetch.service: Deactivated successfully. Dec 16 12:23:51.914447 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Dec 16 12:23:51.916448 systemd[1]: Stopped target network.target - Network. Dec 16 12:23:51.917574 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Dec 16 12:23:51.917640 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Dec 16 12:23:51.918980 systemd[1]: Stopped target paths.target - Path Units. Dec 16 12:23:51.920452 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Dec 16 12:23:51.927306 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:23:51.932889 systemd[1]: Stopped target slices.target - Slice Units. Dec 16 12:23:51.934978 systemd[1]: Stopped target sockets.target - Socket Units. Dec 16 12:23:51.940000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.936836 systemd[1]: iscsid.socket: Deactivated successfully. Dec 16 12:23:51.941000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.936924 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Dec 16 12:23:51.938012 systemd[1]: iscsiuio.socket: Deactivated successfully. Dec 16 12:23:51.938047 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Dec 16 12:23:51.939023 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Dec 16 12:23:51.939050 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:23:51.940370 systemd[1]: ignition-setup.service: Deactivated successfully. Dec 16 12:23:51.946000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.940428 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Dec 16 12:23:51.941302 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Dec 16 12:23:51.941345 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. 
Dec 16 12:23:51.942397 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Dec 16 12:23:51.949000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.943292 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Dec 16 12:23:51.946055 systemd[1]: sysroot-boot.service: Deactivated successfully. Dec 16 12:23:51.946145 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Dec 16 12:23:51.947948 systemd[1]: initrd-setup-root.service: Deactivated successfully. Dec 16 12:23:51.948018 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Dec 16 12:23:51.954160 systemd[1]: systemd-resolved.service: Deactivated successfully. Dec 16 12:23:51.955488 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Dec 16 12:23:51.957000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.958000 audit: BPF prog-id=6 op=UNLOAD Dec 16 12:23:51.959163 systemd[1]: systemd-networkd.service: Deactivated successfully. Dec 16 12:23:51.959315 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Dec 16 12:23:51.960000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.964000 audit: BPF prog-id=9 op=UNLOAD Dec 16 12:23:51.964554 systemd[1]: Stopped target network-pre.target - Preparation for Network. Dec 16 12:23:51.965203 systemd[1]: systemd-networkd.socket: Deactivated successfully. Dec 16 12:23:51.965624 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:23:51.967526 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Dec 16 12:23:51.971000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.972000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.971331 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Dec 16 12:23:51.973000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.971412 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Dec 16 12:23:51.972174 systemd[1]: systemd-sysctl.service: Deactivated successfully. Dec 16 12:23:51.972216 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:23:51.972944 systemd[1]: systemd-modules-load.service: Deactivated successfully. Dec 16 12:23:51.972986 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Dec 16 12:23:51.973797 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:23:51.993167 systemd[1]: systemd-udevd.service: Deactivated successfully. 
Dec 16 12:23:51.995612 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:23:51.996000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.998828 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Dec 16 12:23:51.998911 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Dec 16 12:23:52.001000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:51.999770 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Dec 16 12:23:51.999806 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:23:52.000458 systemd[1]: dracut-pre-udev.service: Deactivated successfully. Dec 16 12:23:52.000509 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Dec 16 12:23:52.002461 systemd[1]: dracut-cmdline.service: Deactivated successfully. Dec 16 12:23:52.002514 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Dec 16 12:23:52.006000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.007122 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Dec 16 12:23:52.008000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.007186 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Dec 16 12:23:52.011145 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Dec 16 12:23:52.013402 systemd[1]: systemd-network-generator.service: Deactivated successfully. Dec 16 12:23:52.017000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.013512 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:23:52.018143 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Dec 16 12:23:52.018207 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:23:52.021000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.021648 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:23:52.021710 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:23:52.022000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.023790 systemd[1]: network-cleanup.service: Deactivated successfully. Dec 16 12:23:52.025394 systemd[1]: Stopped network-cleanup.service - Network Cleanup. 
Dec 16 12:23:52.025000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.032720 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Dec 16 12:23:52.032851 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Dec 16 12:23:52.033000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.033000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.034311 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Dec 16 12:23:52.036738 systemd[1]: Starting initrd-switch-root.service - Switch Root... Dec 16 12:23:52.061469 systemd[1]: Switching root. Dec 16 12:23:52.105050 systemd-journald[349]: Journal stopped Dec 16 12:23:53.098557 systemd-journald[349]: Received SIGTERM from PID 1 (systemd). Dec 16 12:23:53.098625 kernel: SELinux: policy capability network_peer_controls=1 Dec 16 12:23:53.098640 kernel: SELinux: policy capability open_perms=1 Dec 16 12:23:53.098653 kernel: SELinux: policy capability extended_socket_class=1 Dec 16 12:23:53.098666 kernel: SELinux: policy capability always_check_network=0 Dec 16 12:23:53.098675 kernel: SELinux: policy capability cgroup_seclabel=1 Dec 16 12:23:53.098685 kernel: SELinux: policy capability nnp_nosuid_transition=1 Dec 16 12:23:53.098695 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Dec 16 12:23:53.098708 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Dec 16 12:23:53.098718 kernel: SELinux: policy capability userspace_initial_context=0 Dec 16 12:23:53.098729 systemd[1]: Successfully loaded SELinux policy in 52.096ms. Dec 16 12:23:53.098754 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.404ms. Dec 16 12:23:53.098766 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Dec 16 12:23:53.098777 systemd[1]: Detected virtualization kvm. Dec 16 12:23:53.098788 systemd[1]: Detected architecture arm64. Dec 16 12:23:53.098802 systemd[1]: Detected first boot. Dec 16 12:23:53.098812 systemd[1]: Hostname set to . Dec 16 12:23:53.098825 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Dec 16 12:23:53.098835 zram_generator::config[1152]: No configuration found. Dec 16 12:23:53.098850 kernel: NET: Registered PF_VSOCK protocol family Dec 16 12:23:53.098863 systemd[1]: Populated /etc with preset unit settings. Dec 16 12:23:53.098874 systemd[1]: initrd-switch-root.service: Deactivated successfully. Dec 16 12:23:53.098884 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Dec 16 12:23:53.098896 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Dec 16 12:23:53.098907 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. 
Dec 16 12:23:53.098919 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Dec 16 12:23:53.098929 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Dec 16 12:23:53.098940 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Dec 16 12:23:53.098950 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Dec 16 12:23:53.098962 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Dec 16 12:23:53.098974 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Dec 16 12:23:53.098985 systemd[1]: Created slice user.slice - User and Session Slice. Dec 16 12:23:53.098996 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Dec 16 12:23:53.099007 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Dec 16 12:23:53.099018 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Dec 16 12:23:53.099029 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Dec 16 12:23:53.099041 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Dec 16 12:23:53.099053 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Dec 16 12:23:53.099063 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Dec 16 12:23:53.099074 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Dec 16 12:23:53.099085 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Dec 16 12:23:53.099096 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Dec 16 12:23:53.099108 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Dec 16 12:23:53.099118 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Dec 16 12:23:53.099129 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Dec 16 12:23:53.099139 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Dec 16 12:23:53.099150 systemd[1]: Reached target remote-fs.target - Remote File Systems. Dec 16 12:23:53.099161 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Dec 16 12:23:53.099172 systemd[1]: Reached target slices.target - Slice Units. Dec 16 12:23:53.099184 systemd[1]: Reached target swap.target - Swaps. Dec 16 12:23:53.099195 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Dec 16 12:23:53.099240 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Dec 16 12:23:53.099252 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Dec 16 12:23:53.102315 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Dec 16 12:23:53.102368 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Dec 16 12:23:53.102381 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Dec 16 12:23:53.102402 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Dec 16 12:23:53.102413 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Dec 16 12:23:53.102424 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. 
Dec 16 12:23:53.102435 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Dec 16 12:23:53.102446 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Dec 16 12:23:53.102456 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Dec 16 12:23:53.102467 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Dec 16 12:23:53.102479 systemd[1]: Mounting media.mount - External Media Directory... Dec 16 12:23:53.102490 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Dec 16 12:23:53.102501 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Dec 16 12:23:53.102513 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... Dec 16 12:23:53.102525 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Dec 16 12:23:53.102536 systemd[1]: Reached target machines.target - Containers. Dec 16 12:23:53.102547 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Dec 16 12:23:53.102560 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:23:53.102571 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Dec 16 12:23:53.102582 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Dec 16 12:23:53.102593 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:23:53.102604 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:23:53.102616 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:23:53.102626 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Dec 16 12:23:53.102639 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:23:53.102650 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Dec 16 12:23:53.102663 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Dec 16 12:23:53.102674 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Dec 16 12:23:53.102686 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Dec 16 12:23:53.102696 systemd[1]: Stopped systemd-fsck-usr.service. Dec 16 12:23:53.102712 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:23:53.102724 systemd[1]: Starting systemd-journald.service - Journal Service... Dec 16 12:23:53.102737 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Dec 16 12:23:53.102748 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Dec 16 12:23:53.102761 kernel: fuse: init (API version 7.41) Dec 16 12:23:53.102773 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Dec 16 12:23:53.102784 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Dec 16 12:23:53.102795 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... 
Dec 16 12:23:53.102806 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Dec 16 12:23:53.102817 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Dec 16 12:23:53.102830 systemd[1]: Mounted media.mount - External Media Directory. Dec 16 12:23:53.102842 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Dec 16 12:23:53.102853 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Dec 16 12:23:53.102864 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Dec 16 12:23:53.102876 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Dec 16 12:23:53.102889 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Dec 16 12:23:53.102900 kernel: ACPI: bus type drm_connector registered Dec 16 12:23:53.102910 systemd[1]: modprobe@configfs.service: Deactivated successfully. Dec 16 12:23:53.102921 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. Dec 16 12:23:53.102932 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:23:53.102942 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:23:53.102955 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:23:53.102968 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:23:53.102979 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:23:53.102990 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:23:53.103000 systemd[1]: modprobe@fuse.service: Deactivated successfully. Dec 16 12:23:53.103011 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Dec 16 12:23:53.103022 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:23:53.103032 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Dec 16 12:23:53.103044 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Dec 16 12:23:53.103056 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Dec 16 12:23:53.103067 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Dec 16 12:23:53.103078 systemd[1]: Reached target network-pre.target - Preparation for Network. Dec 16 12:23:53.103089 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:23:53.103100 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Dec 16 12:23:53.103112 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Dec 16 12:23:53.103124 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Dec 16 12:23:53.103136 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Dec 16 12:23:53.103147 systemd[1]: Reached target local-fs.target - Local File Systems. Dec 16 12:23:53.103158 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Dec 16 12:23:53.103170 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:23:53.103181 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Dec 16 12:23:53.103192 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Dec 16 12:23:53.103218 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:23:53.105216 systemd-journald[1228]: Collecting audit messages is enabled. Dec 16 12:23:53.105258 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Dec 16 12:23:53.105917 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Dec 16 12:23:53.105942 systemd-journald[1228]: Journal started Dec 16 12:23:53.105978 systemd-journald[1228]: Runtime Journal (/run/log/journal/c8ccae30d5f7424ea73c7f339d676045) is 8M, max 76.5M, 68.5M free. Dec 16 12:23:52.938000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.940000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:52.943000 audit: BPF prog-id=14 op=UNLOAD Dec 16 12:23:52.944000 audit: BPF prog-id=13 op=UNLOAD Dec 16 12:23:52.944000 audit: BPF prog-id=15 op=LOAD Dec 16 12:23:52.945000 audit: BPF prog-id=16 op=LOAD Dec 16 12:23:52.945000 audit: BPF prog-id=17 op=LOAD Dec 16 12:23:53.011000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.014000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.020000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.020000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.027000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.028000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.034000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.034000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:23:53.039000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.039000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.045000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.045000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.048000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.048000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.052000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.054000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.059000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.069000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.093000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Dec 16 12:23:53.093000 audit[1228]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=ffffe8ae9150 a2=4000 a3=0 items=0 ppid=1 pid=1228 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:53.093000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Dec 16 12:23:52.729697 systemd[1]: Queued start job for default target multi-user.target. Dec 16 12:23:53.113364 systemd[1]: Starting systemd-sysusers.service - Create System Users... Dec 16 12:23:52.754482 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Dec 16 12:23:52.755352 systemd[1]: systemd-journald.service: Deactivated successfully. Dec 16 12:23:53.117189 systemd[1]: Started systemd-journald.service - Journal Service. 
Dec 16 12:23:53.116000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.121155 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Dec 16 12:23:53.122000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.135000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.135319 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Dec 16 12:23:53.142464 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. Dec 16 12:23:53.151596 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Dec 16 12:23:53.154294 kernel: loop1: detected capacity change from 0 to 100192 Dec 16 12:23:53.156687 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Dec 16 12:23:53.159340 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Dec 16 12:23:53.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.187309 kernel: loop2: detected capacity change from 0 to 109872 Dec 16 12:23:53.192412 systemd-journald[1228]: Time spent on flushing to /var/log/journal/c8ccae30d5f7424ea73c7f339d676045 is 54.017ms for 1299 entries. Dec 16 12:23:53.192412 systemd-journald[1228]: System Journal (/var/log/journal/c8ccae30d5f7424ea73c7f339d676045) is 8M, max 588.1M, 580.1M free. Dec 16 12:23:53.258511 systemd-journald[1228]: Received client request to flush runtime journal. Dec 16 12:23:53.258574 kernel: loop3: detected capacity change from 0 to 200800 Dec 16 12:23:53.198000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.200000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.206000 audit: BPF prog-id=18 op=LOAD Dec 16 12:23:53.208000 audit: BPF prog-id=19 op=LOAD Dec 16 12:23:53.208000 audit: BPF prog-id=20 op=LOAD Dec 16 12:23:53.212000 audit: BPF prog-id=21 op=LOAD Dec 16 12:23:53.228000 audit: BPF prog-id=22 op=LOAD Dec 16 12:23:53.228000 audit: BPF prog-id=23 op=LOAD Dec 16 12:23:53.228000 audit: BPF prog-id=24 op=LOAD Dec 16 12:23:53.232000 audit: BPF prog-id=25 op=LOAD Dec 16 12:23:53.232000 audit: BPF prog-id=26 op=LOAD Dec 16 12:23:53.232000 audit: BPF prog-id=27 op=LOAD Dec 16 12:23:53.196558 systemd[1]: Finished systemd-sysusers.service - Create System Users. Dec 16 12:23:53.199529 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. 
Dec 16 12:23:53.211448 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Dec 16 12:23:53.261000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.214482 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Dec 16 12:23:53.219794 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Dec 16 12:23:53.229357 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Dec 16 12:23:53.237502 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Dec 16 12:23:53.260782 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Dec 16 12:23:53.278703 systemd-tmpfiles[1285]: ACLs are not supported, ignoring. Dec 16 12:23:53.278720 systemd-tmpfiles[1285]: ACLs are not supported, ignoring. Dec 16 12:23:53.285288 kernel: loop4: detected capacity change from 0 to 8 Dec 16 12:23:53.290432 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Dec 16 12:23:53.292000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.305008 systemd[1]: Started systemd-userdbd.service - User Database Manager. Dec 16 12:23:53.308000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.310324 kernel: loop5: detected capacity change from 0 to 100192 Dec 16 12:23:53.318483 systemd-nsresourced[1287]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Dec 16 12:23:53.319714 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Dec 16 12:23:53.320000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.331605 kernel: loop6: detected capacity change from 0 to 109872 Dec 16 12:23:53.354218 kernel: loop7: detected capacity change from 0 to 200800 Dec 16 12:23:53.377308 kernel: loop1: detected capacity change from 0 to 8 Dec 16 12:23:53.380707 (sd-merge)[1298]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Dec 16 12:23:53.386830 systemd-oomd[1283]: No swap; memory pressure usage will be degraded Dec 16 12:23:53.388000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.387660 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Dec 16 12:23:53.390237 (sd-merge)[1298]: Merged extensions into '/usr'. Dec 16 12:23:53.398846 systemd[1]: Reload requested from client PID 1252 ('systemd-sysext') (unit systemd-sysext.service)... Dec 16 12:23:53.398868 systemd[1]: Reloading... Dec 16 12:23:53.420040 systemd-resolved[1284]: Positive Trust Anchors: Dec 16 12:23:53.420390 systemd-resolved[1284]: . 
IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Dec 16 12:23:53.420464 systemd-resolved[1284]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Dec 16 12:23:53.420535 systemd-resolved[1284]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Dec 16 12:23:53.428781 systemd-resolved[1284]: Using system hostname 'ci-4515-1-0-6-95bdd2e3e7'. Dec 16 12:23:53.535288 zram_generator::config[1338]: No configuration found. Dec 16 12:23:53.712127 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Dec 16 12:23:53.712388 systemd[1]: Reloading finished in 313 ms. Dec 16 12:23:53.726430 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Dec 16 12:23:53.727000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.728713 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Dec 16 12:23:53.732000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:53.735741 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Dec 16 12:23:53.745572 systemd[1]: Starting ensure-sysext.service... Dec 16 12:23:53.751696 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... 
Dec 16 12:23:53.754000 audit: BPF prog-id=28 op=LOAD Dec 16 12:23:53.754000 audit: BPF prog-id=15 op=UNLOAD Dec 16 12:23:53.754000 audit: BPF prog-id=29 op=LOAD Dec 16 12:23:53.754000 audit: BPF prog-id=30 op=LOAD Dec 16 12:23:53.754000 audit: BPF prog-id=16 op=UNLOAD Dec 16 12:23:53.754000 audit: BPF prog-id=17 op=UNLOAD Dec 16 12:23:53.755000 audit: BPF prog-id=31 op=LOAD Dec 16 12:23:53.757000 audit: BPF prog-id=18 op=UNLOAD Dec 16 12:23:53.757000 audit: BPF prog-id=32 op=LOAD Dec 16 12:23:53.757000 audit: BPF prog-id=33 op=LOAD Dec 16 12:23:53.757000 audit: BPF prog-id=19 op=UNLOAD Dec 16 12:23:53.757000 audit: BPF prog-id=20 op=UNLOAD Dec 16 12:23:53.757000 audit: BPF prog-id=34 op=LOAD Dec 16 12:23:53.757000 audit: BPF prog-id=25 op=UNLOAD Dec 16 12:23:53.757000 audit: BPF prog-id=35 op=LOAD Dec 16 12:23:53.758000 audit: BPF prog-id=36 op=LOAD Dec 16 12:23:53.758000 audit: BPF prog-id=26 op=UNLOAD Dec 16 12:23:53.758000 audit: BPF prog-id=27 op=UNLOAD Dec 16 12:23:53.759000 audit: BPF prog-id=37 op=LOAD Dec 16 12:23:53.759000 audit: BPF prog-id=21 op=UNLOAD Dec 16 12:23:53.760000 audit: BPF prog-id=38 op=LOAD Dec 16 12:23:53.761000 audit: BPF prog-id=22 op=UNLOAD Dec 16 12:23:53.761000 audit: BPF prog-id=39 op=LOAD Dec 16 12:23:53.761000 audit: BPF prog-id=40 op=LOAD Dec 16 12:23:53.761000 audit: BPF prog-id=23 op=UNLOAD Dec 16 12:23:53.761000 audit: BPF prog-id=24 op=UNLOAD Dec 16 12:23:53.777470 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Dec 16 12:23:53.783536 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Dec 16 12:23:53.796200 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Dec 16 12:23:53.798629 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Dec 16 12:23:53.805565 systemd[1]: Reload requested from client PID 1374 ('systemctl') (unit ensure-sysext.service)... Dec 16 12:23:53.805594 systemd[1]: Reloading... Dec 16 12:23:53.815636 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Dec 16 12:23:53.816057 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Dec 16 12:23:53.816494 systemd-tmpfiles[1375]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Dec 16 12:23:53.818074 systemd-tmpfiles[1375]: ACLs are not supported, ignoring. Dec 16 12:23:53.818315 systemd-tmpfiles[1375]: ACLs are not supported, ignoring. Dec 16 12:23:53.824066 systemd-tmpfiles[1375]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:23:53.824608 systemd-tmpfiles[1375]: Skipping /boot Dec 16 12:23:53.834563 systemd-tmpfiles[1375]: Detected autofs mount point /boot during canonicalization of boot. Dec 16 12:23:53.834574 systemd-tmpfiles[1375]: Skipping /boot Dec 16 12:23:53.874444 zram_generator::config[1410]: No configuration found. Dec 16 12:23:54.042672 systemd[1]: Reloading finished in 236 ms. Dec 16 12:23:54.066289 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Dec 16 12:23:54.066000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:23:54.069000 audit: BPF prog-id=41 op=LOAD Dec 16 12:23:54.069000 audit: BPF prog-id=34 op=UNLOAD Dec 16 12:23:54.069000 audit: BPF prog-id=42 op=LOAD Dec 16 12:23:54.069000 audit: BPF prog-id=43 op=LOAD Dec 16 12:23:54.069000 audit: BPF prog-id=35 op=UNLOAD Dec 16 12:23:54.069000 audit: BPF prog-id=36 op=UNLOAD Dec 16 12:23:54.069000 audit: BPF prog-id=44 op=LOAD Dec 16 12:23:54.069000 audit: BPF prog-id=37 op=UNLOAD Dec 16 12:23:54.070000 audit: BPF prog-id=45 op=LOAD Dec 16 12:23:54.070000 audit: BPF prog-id=28 op=UNLOAD Dec 16 12:23:54.070000 audit: BPF prog-id=46 op=LOAD Dec 16 12:23:54.070000 audit: BPF prog-id=47 op=LOAD Dec 16 12:23:54.071000 audit: BPF prog-id=29 op=UNLOAD Dec 16 12:23:54.071000 audit: BPF prog-id=30 op=UNLOAD Dec 16 12:23:54.071000 audit: BPF prog-id=48 op=LOAD Dec 16 12:23:54.071000 audit: BPF prog-id=31 op=UNLOAD Dec 16 12:23:54.071000 audit: BPF prog-id=49 op=LOAD Dec 16 12:23:54.071000 audit: BPF prog-id=50 op=LOAD Dec 16 12:23:54.072000 audit: BPF prog-id=32 op=UNLOAD Dec 16 12:23:54.072000 audit: BPF prog-id=33 op=UNLOAD Dec 16 12:23:54.072000 audit: BPF prog-id=51 op=LOAD Dec 16 12:23:54.072000 audit: BPF prog-id=38 op=UNLOAD Dec 16 12:23:54.072000 audit: BPF prog-id=52 op=LOAD Dec 16 12:23:54.072000 audit: BPF prog-id=53 op=LOAD Dec 16 12:23:54.072000 audit: BPF prog-id=39 op=UNLOAD Dec 16 12:23:54.072000 audit: BPF prog-id=40 op=UNLOAD Dec 16 12:23:54.076393 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Dec 16 12:23:54.076000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.084866 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:23:54.087361 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Dec 16 12:23:54.094880 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Dec 16 12:23:54.101733 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... Dec 16 12:23:54.102000 audit: BPF prog-id=8 op=UNLOAD Dec 16 12:23:54.102000 audit: BPF prog-id=7 op=UNLOAD Dec 16 12:23:54.103000 audit: BPF prog-id=54 op=LOAD Dec 16 12:23:54.103000 audit: BPF prog-id=55 op=LOAD Dec 16 12:23:54.107913 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Dec 16 12:23:54.114651 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Dec 16 12:23:54.120875 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:23:54.124503 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Dec 16 12:23:54.128643 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Dec 16 12:23:54.135393 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Dec 16 12:23:54.137502 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:23:54.137747 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. 
Dec 16 12:23:54.137846 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:23:54.143794 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:23:54.143963 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:23:54.144096 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:23:54.144175 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:23:54.147975 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Dec 16 12:23:54.157808 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Dec 16 12:23:54.159566 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Dec 16 12:23:54.160418 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Dec 16 12:23:54.160523 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Dec 16 12:23:54.168000 audit[1456]: SYSTEM_BOOT pid=1456 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.176527 systemd[1]: Finished ensure-sysext.service. Dec 16 12:23:54.177000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.181000 audit: BPF prog-id=56 op=LOAD Dec 16 12:23:54.185645 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Dec 16 12:23:54.206898 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Dec 16 12:23:54.209000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.213000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.213000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.213084 systemd[1]: modprobe@loop.service: Deactivated successfully. Dec 16 12:23:54.213418 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. 
Dec 16 12:23:54.215586 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Dec 16 12:23:54.215812 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Dec 16 12:23:54.217000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.217628 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Dec 16 12:23:54.227001 systemd[1]: modprobe@drm.service: Deactivated successfully. Dec 16 12:23:54.228000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.228000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.228336 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Dec 16 12:23:54.231665 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Dec 16 12:23:54.231882 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Dec 16 12:23:54.234000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.234000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.236422 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Dec 16 12:23:54.238000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.239492 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Dec 16 12:23:54.240242 systemd-udevd[1454]: Using default interface naming scheme 'v257'. Dec 16 12:23:54.240000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:23:54.241975 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Dec 16 12:23:54.242895 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). 
Dec 16 12:23:54.265000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Dec 16 12:23:54.265000 audit[1490]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffc6098100 a2=420 a3=0 items=0 ppid=1450 pid=1490 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:23:54.265000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:23:54.266156 augenrules[1490]: No rules Dec 16 12:23:54.268602 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:23:54.269490 systemd[1]: Finished audit-rules.service - Load Audit Rules. Dec 16 12:23:54.286155 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Dec 16 12:23:54.290370 systemd[1]: Starting systemd-networkd.service - Network Configuration... Dec 16 12:23:54.318012 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Dec 16 12:23:54.319470 systemd[1]: Reached target time-set.target - System Time Set. Dec 16 12:23:54.381816 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Dec 16 12:23:54.429829 systemd-networkd[1501]: lo: Link UP Dec 16 12:23:54.430103 systemd-networkd[1501]: lo: Gained carrier Dec 16 12:23:54.434805 systemd[1]: Started systemd-networkd.service - Network Configuration. Dec 16 12:23:54.436263 systemd[1]: Reached target network.target - Network. Dec 16 12:23:54.440595 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Dec 16 12:23:54.444383 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Dec 16 12:23:54.499330 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. Dec 16 12:23:54.508048 systemd-networkd[1501]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:54.508866 systemd-networkd[1501]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:23:54.511556 systemd-networkd[1501]: eth1: Link UP Dec 16 12:23:54.512983 systemd-networkd[1501]: eth1: Gained carrier Dec 16 12:23:54.513011 systemd-networkd[1501]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:54.521073 systemd-networkd[1501]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:54.521086 systemd-networkd[1501]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Dec 16 12:23:54.523986 systemd-networkd[1501]: eth0: Link UP Dec 16 12:23:54.526340 systemd-networkd[1501]: eth0: Gained carrier Dec 16 12:23:54.526368 systemd-networkd[1501]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Dec 16 12:23:54.553867 systemd-networkd[1501]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Dec 16 12:23:54.559015 systemd-timesyncd[1469]: Network configuration changed, trying to establish connection. 
Dec 16 12:23:54.580103 kernel: mousedev: PS/2 mouse device common for all mice Dec 16 12:23:54.588370 systemd-networkd[1501]: eth0: DHCPv4 address 128.140.49.38/32, gateway 172.31.1.1 acquired from 172.31.1.1 Dec 16 12:23:54.589999 systemd-timesyncd[1469]: Network configuration changed, trying to establish connection. Dec 16 12:23:54.607923 ldconfig[1452]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Dec 16 12:23:54.613031 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Dec 16 12:23:54.617454 systemd[1]: Starting systemd-update-done.service - Update is Completed... Dec 16 12:23:54.658438 systemd[1]: Finished systemd-update-done.service - Update is Completed. Dec 16 12:23:54.662753 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Dec 16 12:23:54.663730 systemd[1]: Reached target sysinit.target - System Initialization. Dec 16 12:23:54.664873 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Dec 16 12:23:54.666530 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Dec 16 12:23:54.667502 systemd[1]: Started logrotate.timer - Daily rotation of log files. Dec 16 12:23:54.668581 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Dec 16 12:23:54.669762 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Dec 16 12:23:54.670711 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Dec 16 12:23:54.671477 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Dec 16 12:23:54.672247 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). Dec 16 12:23:54.672302 systemd[1]: Reached target paths.target - Path Units. Dec 16 12:23:54.673162 systemd[1]: Reached target timers.target - Timer Units. Dec 16 12:23:54.675989 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Dec 16 12:23:54.679015 systemd[1]: Starting docker.socket - Docker Socket for the API... Dec 16 12:23:54.682620 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Dec 16 12:23:54.683841 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Dec 16 12:23:54.685628 systemd[1]: Reached target ssh-access.target - SSH Access Available. Dec 16 12:23:54.689577 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Dec 16 12:23:54.692449 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Dec 16 12:23:54.696741 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... Dec 16 12:23:54.699350 systemd[1]: Listening on docker.socket - Docker Socket for the API. Dec 16 12:23:54.705565 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Dec 16 12:23:54.707602 systemd[1]: Reached target sockets.target - Socket Units. Dec 16 12:23:54.709904 systemd[1]: Reached target basic.target - Basic System. Dec 16 12:23:54.710603 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. 
Dec 16 12:23:54.710625 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Dec 16 12:23:54.720522 systemd[1]: Starting containerd.service - containerd container runtime... Dec 16 12:23:54.723685 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Dec 16 12:23:54.728524 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Dec 16 12:23:54.732566 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Dec 16 12:23:54.738626 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Dec 16 12:23:54.744626 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Dec 16 12:23:54.745392 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Dec 16 12:23:54.750020 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Dec 16 12:23:54.753540 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Dec 16 12:23:54.756387 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Dec 16 12:23:54.762901 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Dec 16 12:23:54.769573 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Dec 16 12:23:54.777561 systemd[1]: Starting systemd-logind.service - User Login Management... Dec 16 12:23:54.778366 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Dec 16 12:23:54.778978 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Dec 16 12:23:54.784559 systemd[1]: Starting update-engine.service - Update Engine... Dec 16 12:23:54.791797 jq[1569]: false Dec 16 12:23:54.792351 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Dec 16 12:23:54.798365 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Dec 16 12:23:54.800423 extend-filesystems[1570]: Found /dev/sda6 Dec 16 12:23:54.810076 extend-filesystems[1570]: Found /dev/sda9 Dec 16 12:23:54.805333 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Dec 16 12:23:54.823015 extend-filesystems[1570]: Checking size of /dev/sda9 Dec 16 12:23:54.806377 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Dec 16 12:23:54.806613 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Dec 16 12:23:54.841499 extend-filesystems[1570]: Resized partition /dev/sda9 Dec 16 12:23:54.852524 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks Dec 16 12:23:54.852606 extend-filesystems[1607]: resize2fs 1.47.3 (8-Jul-2025) Dec 16 12:23:54.849825 systemd[1]: motdgen.service: Deactivated successfully. Dec 16 12:23:54.851199 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Dec 16 12:23:54.860848 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Dec 16 12:23:54.861116 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. 
Dec 16 12:23:54.878381 jq[1583]: true Dec 16 12:23:54.892592 coreos-metadata[1564]: Dec 16 12:23:54.892 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Dec 16 12:23:54.896095 coreos-metadata[1564]: Dec 16 12:23:54.894 INFO Fetch successful Dec 16 12:23:54.897412 coreos-metadata[1564]: Dec 16 12:23:54.897 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Dec 16 12:23:54.900007 coreos-metadata[1564]: Dec 16 12:23:54.898 INFO Fetch successful Dec 16 12:23:54.903980 tar[1598]: linux-arm64/LICENSE Dec 16 12:23:54.909860 dbus-daemon[1565]: [system] SELinux support is enabled Dec 16 12:23:54.910392 systemd[1]: Started dbus.service - D-Bus System Message Bus. Dec 16 12:23:54.916886 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Dec 16 12:23:54.917081 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Dec 16 12:23:54.918857 tar[1598]: linux-arm64/helm Dec 16 12:23:54.920741 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Dec 16 12:23:54.920786 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. Dec 16 12:23:54.927781 update_engine[1582]: I20251216 12:23:54.927551 1582 main.cc:92] Flatcar Update Engine starting Dec 16 12:23:54.931595 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Dec 16 12:23:54.931662 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Dec 16 12:23:54.931675 kernel: [drm] features: -context_init Dec 16 12:23:54.933509 kernel: [drm] number of scanouts: 1 Dec 16 12:23:54.933569 kernel: [drm] number of cap sets: 0 Dec 16 12:23:54.934134 jq[1614]: true Dec 16 12:23:54.937146 systemd[1]: Started update-engine.service - Update Engine. Dec 16 12:23:54.940123 update_engine[1582]: I20251216 12:23:54.938217 1582 update_check_scheduler.cc:74] Next update check in 5m35s Dec 16 12:23:54.954710 systemd[1]: Started locksmithd.service - Cluster reboot manager. Dec 16 12:23:54.964546 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Dec 16 12:23:54.983378 kernel: EXT4-fs (sda9): resized filesystem to 8410107 Dec 16 12:23:54.996991 extend-filesystems[1607]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Dec 16 12:23:54.996991 extend-filesystems[1607]: old_desc_blocks = 1, new_desc_blocks = 5 Dec 16 12:23:54.996991 extend-filesystems[1607]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long. Dec 16 12:23:54.995200 systemd[1]: extend-filesystems.service: Deactivated successfully. Dec 16 12:23:55.006578 extend-filesystems[1570]: Resized filesystem in /dev/sda9 Dec 16 12:23:54.996358 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. 
Dec 16 12:23:55.065538 bash[1646]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:23:55.095879 kernel: Console: switching to colour frame buffer device 160x50 Dec 16 12:23:55.105899 containerd[1611]: time="2025-12-16T12:23:55Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Dec 16 12:23:55.106910 containerd[1611]: time="2025-12-16T12:23:55.106415240Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Dec 16 12:23:55.132079 containerd[1611]: time="2025-12-16T12:23:55.132026320Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.96µs" Dec 16 12:23:55.132079 containerd[1611]: time="2025-12-16T12:23:55.132068720Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Dec 16 12:23:55.132192 containerd[1611]: time="2025-12-16T12:23:55.132115000Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Dec 16 12:23:55.132192 containerd[1611]: time="2025-12-16T12:23:55.132127200Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134406200Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134456680Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134521560Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134533440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134770040Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134785000Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134796880Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134805320Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134937120Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.134949080Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 
containerd[1611]: time="2025-12-16T12:23:55.135023560Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135314 containerd[1611]: time="2025-12-16T12:23:55.135205240Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135600 containerd[1611]: time="2025-12-16T12:23:55.135234160Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Dec 16 12:23:55.135600 containerd[1611]: time="2025-12-16T12:23:55.135244520Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Dec 16 12:23:55.135600 containerd[1611]: time="2025-12-16T12:23:55.135288720Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Dec 16 12:23:55.135600 containerd[1611]: time="2025-12-16T12:23:55.135495880Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Dec 16 12:23:55.135600 containerd[1611]: time="2025-12-16T12:23:55.135562760Z" level=info msg="metadata content store policy set" policy=shared Dec 16 12:23:55.141735 containerd[1611]: time="2025-12-16T12:23:55.141610040Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Dec 16 12:23:55.141735 containerd[1611]: time="2025-12-16T12:23:55.141690480Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141784200Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141797920Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141810760Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141824880Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141838680Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141851040Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141864520Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Dec 16 12:23:55.141887 containerd[1611]: time="2025-12-16T12:23:55.141880800Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Dec 16 12:23:55.142022 containerd[1611]: time="2025-12-16T12:23:55.141893920Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Dec 16 12:23:55.142022 containerd[1611]: time="2025-12-16T12:23:55.141905800Z" level=info msg="loading plugin" 
id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Dec 16 12:23:55.142022 containerd[1611]: time="2025-12-16T12:23:55.141916120Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Dec 16 12:23:55.142022 containerd[1611]: time="2025-12-16T12:23:55.141928560Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Dec 16 12:23:55.142083 containerd[1611]: time="2025-12-16T12:23:55.142072440Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Dec 16 12:23:55.142104 containerd[1611]: time="2025-12-16T12:23:55.142092200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Dec 16 12:23:55.142126 containerd[1611]: time="2025-12-16T12:23:55.142110920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Dec 16 12:23:55.142152 containerd[1611]: time="2025-12-16T12:23:55.142124320Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Dec 16 12:23:55.142152 containerd[1611]: time="2025-12-16T12:23:55.142137440Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Dec 16 12:23:55.142238 containerd[1611]: time="2025-12-16T12:23:55.142150520Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Dec 16 12:23:55.142238 containerd[1611]: time="2025-12-16T12:23:55.142179840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Dec 16 12:23:55.142238 containerd[1611]: time="2025-12-16T12:23:55.142192840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Dec 16 12:23:55.142238 containerd[1611]: time="2025-12-16T12:23:55.142204840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Dec 16 12:23:55.142238 containerd[1611]: time="2025-12-16T12:23:55.142216640Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Dec 16 12:23:55.142238 containerd[1611]: time="2025-12-16T12:23:55.142227040Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Dec 16 12:23:55.142381 containerd[1611]: time="2025-12-16T12:23:55.142258520Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Dec 16 12:23:55.144183 containerd[1611]: time="2025-12-16T12:23:55.143513080Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Dec 16 12:23:55.144183 containerd[1611]: time="2025-12-16T12:23:55.143546640Z" level=info msg="Start snapshots syncer" Dec 16 12:23:55.144183 containerd[1611]: time="2025-12-16T12:23:55.143798480Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Dec 16 12:23:55.144308 containerd[1611]: time="2025-12-16T12:23:55.144089720Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Dec 16 12:23:55.144308 containerd[1611]: time="2025-12-16T12:23:55.144218160Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Dec 16 12:23:55.145726 containerd[1611]: time="2025-12-16T12:23:55.145320240Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Dec 16 12:23:55.145726 containerd[1611]: time="2025-12-16T12:23:55.145698960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Dec 16 12:23:55.145726 containerd[1611]: time="2025-12-16T12:23:55.145728960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Dec 16 12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145747920Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Dec 16 12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145761200Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Dec 16 12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145783560Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Dec 16 12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145794840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Dec 16 12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145807640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Dec 16 12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145820400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Dec 16 
12:23:55.145853 containerd[1611]: time="2025-12-16T12:23:55.145835160Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145887520Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145904600Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145916200Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145937920Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145948440Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145960680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145973400Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Dec 16 12:23:55.145990 containerd[1611]: time="2025-12-16T12:23:55.145985400Z" level=info msg="runtime interface created" Dec 16 12:23:55.146136 containerd[1611]: time="2025-12-16T12:23:55.145991480Z" level=info msg="created NRI interface" Dec 16 12:23:55.146136 containerd[1611]: time="2025-12-16T12:23:55.146009480Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Dec 16 12:23:55.146136 containerd[1611]: time="2025-12-16T12:23:55.146024320Z" level=info msg="Connect containerd service" Dec 16 12:23:55.146136 containerd[1611]: time="2025-12-16T12:23:55.146044920Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Dec 16 12:23:55.151261 containerd[1611]: time="2025-12-16T12:23:55.150785360Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:23:55.188395 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Dec 16 12:23:55.217661 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Dec 16 12:23:55.237614 systemd[1]: Starting sshkeys.service... Dec 16 12:23:55.256028 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Dec 16 12:23:55.257533 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Dec 16 12:23:55.261822 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:23:55.286597 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Dec 16 12:23:55.291654 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... 
Dec 16 12:23:55.354742 coreos-metadata[1666]: Dec 16 12:23:55.353 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Dec 16 12:23:55.354742 coreos-metadata[1666]: Dec 16 12:23:55.354 INFO Fetch successful Dec 16 12:23:55.359959 unknown[1666]: wrote ssh authorized keys file for user: core Dec 16 12:23:55.396387 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Dec 16 12:23:55.398343 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:23:55.402805 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Dec 16 12:23:55.403613 update-ssh-keys[1675]: Updated "/home/core/.ssh/authorized_keys" Dec 16 12:23:55.406634 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Dec 16 12:23:55.410909 systemd[1]: Finished sshkeys.service. Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.467971080Z" level=info msg="Start subscribing containerd event" Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.468393800Z" level=info msg="Start recovering state" Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.468522320Z" level=info msg="Start event monitor" Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.468972160Z" level=info msg="Start cni network conf syncer for default" Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.468991720Z" level=info msg="Start streaming server" Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.469000840Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.469007800Z" level=info msg="runtime interface starting up..." Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.469014080Z" level=info msg="starting plugins..." Dec 16 12:23:55.471330 containerd[1611]: time="2025-12-16T12:23:55.469368200Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Dec 16 12:23:55.473442 containerd[1611]: time="2025-12-16T12:23:55.471704000Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Dec 16 12:23:55.473442 containerd[1611]: time="2025-12-16T12:23:55.471773080Z" level=info msg=serving... address=/run/containerd/containerd.sock Dec 16 12:23:55.473442 containerd[1611]: time="2025-12-16T12:23:55.472787640Z" level=info msg="containerd successfully booted in 0.367417s" Dec 16 12:23:55.472022 systemd[1]: Started containerd.service - containerd container runtime. Dec 16 12:23:55.543506 locksmithd[1621]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Dec 16 12:23:55.559007 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Dec 16 12:23:55.568069 systemd-logind[1580]: New seat seat0. Dec 16 12:23:55.575653 systemd-logind[1580]: Watching system buttons on /dev/input/event0 (Power Button) Dec 16 12:23:55.575682 systemd-logind[1580]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Dec 16 12:23:55.576538 systemd[1]: Started systemd-logind.service - User Login Management. Dec 16 12:23:55.744422 tar[1598]: linux-arm64/README.md Dec 16 12:23:55.764350 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Dec 16 12:23:55.826547 sshd_keygen[1595]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Dec 16 12:23:55.851518 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. 
Dec 16 12:23:55.856052 systemd[1]: Starting issuegen.service - Generate /run/issue... Dec 16 12:23:55.877014 systemd[1]: issuegen.service: Deactivated successfully. Dec 16 12:23:55.877367 systemd[1]: Finished issuegen.service - Generate /run/issue. Dec 16 12:23:55.881965 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Dec 16 12:23:55.906484 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Dec 16 12:23:55.909937 systemd[1]: Started getty@tty1.service - Getty on tty1. Dec 16 12:23:55.913730 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Dec 16 12:23:55.914635 systemd[1]: Reached target getty.target - Login Prompts. Dec 16 12:23:56.014473 systemd-networkd[1501]: eth1: Gained IPv6LL Dec 16 12:23:56.015365 systemd-timesyncd[1469]: Network configuration changed, trying to establish connection. Dec 16 12:23:56.018768 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Dec 16 12:23:56.020197 systemd[1]: Reached target network-online.target - Network is Online. Dec 16 12:23:56.023114 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:23:56.026558 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Dec 16 12:23:56.065848 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Dec 16 12:23:56.398487 systemd-networkd[1501]: eth0: Gained IPv6LL Dec 16 12:23:56.398969 systemd-timesyncd[1469]: Network configuration changed, trying to establish connection. Dec 16 12:23:56.821724 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:23:56.824361 systemd[1]: Reached target multi-user.target - Multi-User System. Dec 16 12:23:56.830375 systemd[1]: Startup finished in 1.877s (kernel) + 5.059s (initrd) + 4.662s (userspace) = 11.600s. Dec 16 12:23:56.836772 (kubelet)[1736]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:23:57.306994 kubelet[1736]: E1216 12:23:57.306854 1736 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:23:57.309510 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:23:57.309654 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:23:57.310249 systemd[1]: kubelet.service: Consumed 830ms CPU time, 246.8M memory peak. Dec 16 12:24:07.431311 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Dec 16 12:24:07.434282 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:07.607123 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Dec 16 12:24:07.624791 (kubelet)[1754]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:24:07.680567 kubelet[1754]: E1216 12:24:07.680501 1754 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:24:07.684852 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:24:07.685136 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:24:07.685843 systemd[1]: kubelet.service: Consumed 181ms CPU time, 107.1M memory peak. Dec 16 12:24:17.931016 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Dec 16 12:24:17.933526 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:18.101144 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:18.113718 (kubelet)[1770]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:24:18.156917 kubelet[1770]: E1216 12:24:18.156851 1770 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:24:18.159790 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:24:18.159973 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:24:18.161082 systemd[1]: kubelet.service: Consumed 164ms CPU time, 106.8M memory peak. Dec 16 12:24:26.436380 systemd-timesyncd[1469]: Contacted time server 185.233.107.180:123 (2.flatcar.pool.ntp.org). Dec 16 12:24:26.436468 systemd-timesyncd[1469]: Initial clock synchronization to Tue 2025-12-16 12:24:26.371903 UTC. Dec 16 12:24:28.181475 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Dec 16 12:24:28.185020 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:28.368113 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:28.383058 (kubelet)[1785]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:24:28.432492 kubelet[1785]: E1216 12:24:28.432374 1785 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:24:28.435150 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:24:28.435328 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:24:28.437396 systemd[1]: kubelet.service: Consumed 183ms CPU time, 106.7M memory peak. Dec 16 12:24:29.388531 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. 
Dec 16 12:24:29.391666 systemd[1]: Started sshd@0-128.140.49.38:22-139.178.89.65:48142.service - OpenSSH per-connection server daemon (139.178.89.65:48142). Dec 16 12:24:30.280079 sshd[1793]: Accepted publickey for core from 139.178.89.65 port 48142 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:30.283694 sshd-session[1793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:30.293645 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Dec 16 12:24:30.294926 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Dec 16 12:24:30.303245 systemd-logind[1580]: New session 1 of user core. Dec 16 12:24:30.325643 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Dec 16 12:24:30.329355 systemd[1]: Starting user@500.service - User Manager for UID 500... Dec 16 12:24:30.344566 (systemd)[1798]: pam_unix(systemd-user:session): session opened for user core(uid=500) by (uid=0) Dec 16 12:24:30.349638 systemd-logind[1580]: New session c1 of user core. Dec 16 12:24:30.479918 systemd[1798]: Queued start job for default target default.target. Dec 16 12:24:30.490480 systemd[1798]: Created slice app.slice - User Application Slice. Dec 16 12:24:30.490777 systemd[1798]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Dec 16 12:24:30.490979 systemd[1798]: Reached target paths.target - Paths. Dec 16 12:24:30.491216 systemd[1798]: Reached target timers.target - Timers. Dec 16 12:24:30.494450 systemd[1798]: Starting dbus.socket - D-Bus User Message Bus Socket... Dec 16 12:24:30.496433 systemd[1798]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Dec 16 12:24:30.509984 systemd[1798]: Listening on dbus.socket - D-Bus User Message Bus Socket. Dec 16 12:24:30.510052 systemd[1798]: Reached target sockets.target - Sockets. Dec 16 12:24:30.513835 systemd[1798]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Dec 16 12:24:30.514032 systemd[1798]: Reached target basic.target - Basic System. Dec 16 12:24:30.514129 systemd[1798]: Reached target default.target - Main User Target. Dec 16 12:24:30.514173 systemd[1798]: Startup finished in 155ms. Dec 16 12:24:30.514477 systemd[1]: Started user@500.service - User Manager for UID 500. Dec 16 12:24:30.519538 systemd[1]: Started session-1.scope - Session 1 of User core. Dec 16 12:24:31.026524 systemd[1]: Started sshd@1-128.140.49.38:22-139.178.89.65:38468.service - OpenSSH per-connection server daemon (139.178.89.65:38468). Dec 16 12:24:31.916716 sshd[1811]: Accepted publickey for core from 139.178.89.65 port 38468 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:31.919862 sshd-session[1811]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:31.928372 systemd-logind[1580]: New session 2 of user core. Dec 16 12:24:31.934932 systemd[1]: Started session-2.scope - Session 2 of User core. Dec 16 12:24:32.421187 sshd[1814]: Connection closed by 139.178.89.65 port 38468 Dec 16 12:24:32.422336 sshd-session[1811]: pam_unix(sshd:session): session closed for user core Dec 16 12:24:32.428139 systemd[1]: sshd@1-128.140.49.38:22-139.178.89.65:38468.service: Deactivated successfully. Dec 16 12:24:32.431885 systemd[1]: session-2.scope: Deactivated successfully. Dec 16 12:24:32.433366 systemd-logind[1580]: Session 2 logged out. Waiting for processes to exit. 
Dec 16 12:24:32.435369 systemd-logind[1580]: Removed session 2. Dec 16 12:24:32.609350 systemd[1]: Started sshd@2-128.140.49.38:22-139.178.89.65:38476.service - OpenSSH per-connection server daemon (139.178.89.65:38476). Dec 16 12:24:33.520331 sshd[1820]: Accepted publickey for core from 139.178.89.65 port 38476 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:33.523393 sshd-session[1820]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:33.529403 systemd-logind[1580]: New session 3 of user core. Dec 16 12:24:33.535662 systemd[1]: Started session-3.scope - Session 3 of User core. Dec 16 12:24:34.034280 sshd[1823]: Connection closed by 139.178.89.65 port 38476 Dec 16 12:24:34.034907 sshd-session[1820]: pam_unix(sshd:session): session closed for user core Dec 16 12:24:34.039825 systemd[1]: sshd@2-128.140.49.38:22-139.178.89.65:38476.service: Deactivated successfully. Dec 16 12:24:34.041957 systemd[1]: session-3.scope: Deactivated successfully. Dec 16 12:24:34.043866 systemd-logind[1580]: Session 3 logged out. Waiting for processes to exit. Dec 16 12:24:34.045989 systemd-logind[1580]: Removed session 3. Dec 16 12:24:34.219905 systemd[1]: Started sshd@3-128.140.49.38:22-139.178.89.65:38490.service - OpenSSH per-connection server daemon (139.178.89.65:38490). Dec 16 12:24:35.116505 sshd[1829]: Accepted publickey for core from 139.178.89.65 port 38490 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:35.119331 sshd-session[1829]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:35.125736 systemd-logind[1580]: New session 4 of user core. Dec 16 12:24:35.133658 systemd[1]: Started session-4.scope - Session 4 of User core. Dec 16 12:24:35.628698 sshd[1832]: Connection closed by 139.178.89.65 port 38490 Dec 16 12:24:35.629494 sshd-session[1829]: pam_unix(sshd:session): session closed for user core Dec 16 12:24:35.635526 systemd[1]: sshd@3-128.140.49.38:22-139.178.89.65:38490.service: Deactivated successfully. Dec 16 12:24:35.637429 systemd[1]: session-4.scope: Deactivated successfully. Dec 16 12:24:35.638751 systemd-logind[1580]: Session 4 logged out. Waiting for processes to exit. Dec 16 12:24:35.640089 systemd-logind[1580]: Removed session 4. Dec 16 12:24:35.804941 systemd[1]: Started sshd@4-128.140.49.38:22-139.178.89.65:38498.service - OpenSSH per-connection server daemon (139.178.89.65:38498). Dec 16 12:24:36.705575 sshd[1838]: Accepted publickey for core from 139.178.89.65 port 38498 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:36.707198 sshd-session[1838]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:36.713511 systemd-logind[1580]: New session 5 of user core. Dec 16 12:24:36.719505 systemd[1]: Started session-5.scope - Session 5 of User core. Dec 16 12:24:37.058173 sudo[1842]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Dec 16 12:24:37.058916 sudo[1842]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:24:37.080094 sudo[1842]: pam_unix(sudo:session): session closed for user root Dec 16 12:24:37.248450 sshd[1841]: Connection closed by 139.178.89.65 port 38498 Dec 16 12:24:37.249857 sshd-session[1838]: pam_unix(sshd:session): session closed for user core Dec 16 12:24:37.257575 systemd[1]: sshd@4-128.140.49.38:22-139.178.89.65:38498.service: Deactivated successfully. 
Dec 16 12:24:37.261339 systemd[1]: session-5.scope: Deactivated successfully. Dec 16 12:24:37.262378 systemd-logind[1580]: Session 5 logged out. Waiting for processes to exit. Dec 16 12:24:37.263811 systemd-logind[1580]: Removed session 5. Dec 16 12:24:37.434438 systemd[1]: Started sshd@5-128.140.49.38:22-139.178.89.65:38502.service - OpenSSH per-connection server daemon (139.178.89.65:38502). Dec 16 12:24:38.350972 sshd[1848]: Accepted publickey for core from 139.178.89.65 port 38502 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:38.354118 sshd-session[1848]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:38.359333 systemd-logind[1580]: New session 6 of user core. Dec 16 12:24:38.366952 systemd[1]: Started session-6.scope - Session 6 of User core. Dec 16 12:24:38.681241 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Dec 16 12:24:38.684348 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:38.705197 sudo[1854]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Dec 16 12:24:38.705994 sudo[1854]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:24:38.713449 sudo[1854]: pam_unix(sudo:session): session closed for user root Dec 16 12:24:38.726255 sudo[1853]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Dec 16 12:24:38.726907 sudo[1853]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:24:38.741316 systemd[1]: Starting audit-rules.service - Load Audit Rules... Dec 16 12:24:38.795000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:24:38.795730 augenrules[1878]: No rules Dec 16 12:24:38.797022 kernel: kauditd_printk_skb: 178 callbacks suppressed Dec 16 12:24:38.797088 kernel: audit: type=1305 audit(1765887878.795:223): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Dec 16 12:24:38.797358 kernel: audit: type=1300 audit(1765887878.795:223): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff552b9d0 a2=420 a3=0 items=0 ppid=1859 pid=1878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:38.795000 audit[1878]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff552b9d0 a2=420 a3=0 items=0 ppid=1859 pid=1878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:38.797927 systemd[1]: audit-rules.service: Deactivated successfully. Dec 16 12:24:38.795000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:24:38.802477 sudo[1853]: pam_unix(sudo:session): session closed for user root Dec 16 12:24:38.800327 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Dec 16 12:24:38.804953 kernel: audit: type=1327 audit(1765887878.795:223): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Dec 16 12:24:38.800000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.807601 kernel: audit: type=1130 audit(1765887878.800:224): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.800000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.810983 kernel: audit: type=1131 audit(1765887878.800:225): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.813307 kernel: audit: type=1106 audit(1765887878.802:226): pid=1853 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.802000 audit[1853]: USER_END pid=1853 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.802000 audit[1853]: CRED_DISP pid=1853 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.820513 kernel: audit: type=1104 audit(1765887878.802:227): pid=1853 uid=500 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.864000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.864244 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:38.868319 kernel: audit: type=1130 audit(1765887878.864:228): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:24:38.874637 (kubelet)[1887]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:24:38.917896 kubelet[1887]: E1216 12:24:38.917814 1887 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:24:38.920627 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:24:38.920875 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:24:38.920000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:24:38.921540 systemd[1]: kubelet.service: Consumed 170ms CPU time, 106.8M memory peak. Dec 16 12:24:38.924305 kernel: audit: type=1131 audit(1765887878.920:229): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:24:38.977368 sshd[1851]: Connection closed by 139.178.89.65 port 38502 Dec 16 12:24:38.976627 sshd-session[1848]: pam_unix(sshd:session): session closed for user core Dec 16 12:24:38.978000 audit[1848]: USER_END pid=1848 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:38.978000 audit[1848]: CRED_DISP pid=1848 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:38.982499 kernel: audit: type=1106 audit(1765887878.978:230): pid=1848 uid=0 auid=500 ses=6 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:38.983105 systemd[1]: sshd@5-128.140.49.38:22-139.178.89.65:38502.service: Deactivated successfully. Dec 16 12:24:38.982000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@5-128.140.49.38:22-139.178.89.65:38502 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:38.983905 systemd-logind[1580]: Session 6 logged out. Waiting for processes to exit. Dec 16 12:24:38.987206 systemd[1]: session-6.scope: Deactivated successfully. Dec 16 12:24:38.990090 systemd-logind[1580]: Removed session 6. Dec 16 12:24:39.160000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-128.140.49.38:22-139.178.89.65:38508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:24:39.161742 systemd[1]: Started sshd@6-128.140.49.38:22-139.178.89.65:38508.service - OpenSSH per-connection server daemon (139.178.89.65:38508). Dec 16 12:24:40.077000 audit[1899]: USER_ACCT pid=1899 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:40.078642 sshd[1899]: Accepted publickey for core from 139.178.89.65 port 38508 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:24:40.079000 audit[1899]: CRED_ACQ pid=1899 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:40.079000 audit[1899]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffffbbccd90 a2=3 a3=0 items=0 ppid=1 pid=1899 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=7 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:40.079000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:24:40.080829 sshd-session[1899]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:24:40.090442 systemd-logind[1580]: New session 7 of user core. Dec 16 12:24:40.092500 systemd[1]: Started session-7.scope - Session 7 of User core. Dec 16 12:24:40.095000 audit[1899]: USER_START pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:40.097000 audit[1902]: CRED_ACQ pid=1902 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:24:40.231045 update_engine[1582]: I20251216 12:24:40.230484 1582 update_attempter.cc:509] Updating boot flags... Dec 16 12:24:40.433000 audit[1925]: USER_ACCT pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:40.434000 audit[1925]: CRED_REFR pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:40.435230 sudo[1925]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Dec 16 12:24:40.435901 sudo[1925]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Dec 16 12:24:40.438000 audit[1925]: USER_START pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:24:40.775032 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Dec 16 12:24:40.788887 (dockerd)[1945]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Dec 16 12:24:41.023438 dockerd[1945]: time="2025-12-16T12:24:41.022925391Z" level=info msg="Starting up" Dec 16 12:24:41.024521 dockerd[1945]: time="2025-12-16T12:24:41.024477866Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Dec 16 12:24:41.042044 dockerd[1945]: time="2025-12-16T12:24:41.041575624Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Dec 16 12:24:41.085352 dockerd[1945]: time="2025-12-16T12:24:41.085044321Z" level=info msg="Loading containers: start." Dec 16 12:24:41.097325 kernel: Initializing XFRM netlink socket Dec 16 12:24:41.156000 audit[1993]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1993 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.156000 audit[1993]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff0ae5b10 a2=0 a3=0 items=0 ppid=1945 pid=1993 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.156000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:24:41.158000 audit[1995]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1995 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.158000 audit[1995]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffff47cf830 a2=0 a3=0 items=0 ppid=1945 pid=1995 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.158000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:24:41.161000 audit[1997]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1997 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.161000 audit[1997]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe48f7f80 a2=0 a3=0 items=0 ppid=1945 pid=1997 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.161000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:24:41.165000 audit[1999]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1999 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.165000 audit[1999]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc77930b0 a2=0 a3=0 items=0 ppid=1945 pid=1999 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.165000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:24:41.168000 audit[2001]: NETFILTER_CFG table=filter:6 family=2 entries=1 
op=nft_register_chain pid=2001 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.168000 audit[2001]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc2366160 a2=0 a3=0 items=0 ppid=1945 pid=2001 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.168000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:24:41.170000 audit[2003]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=2003 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.170000 audit[2003]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffff2f26f30 a2=0 a3=0 items=0 ppid=1945 pid=2003 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.170000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:24:41.173000 audit[2005]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=2005 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.173000 audit[2005]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffda0c9f20 a2=0 a3=0 items=0 ppid=1945 pid=2005 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.173000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:24:41.176000 audit[2007]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=2007 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.176000 audit[2007]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe5789fc0 a2=0 a3=0 items=0 ppid=1945 pid=2007 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.176000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:24:41.204000 audit[2010]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=2010 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.204000 audit[2010]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffc642a4b0 a2=0 a3=0 items=0 ppid=1945 pid=2010 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.204000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Dec 16 12:24:41.207000 audit[2012]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=2012 
subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.207000 audit[2012]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc7e90ca0 a2=0 a3=0 items=0 ppid=1945 pid=2012 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.207000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:24:41.210000 audit[2014]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=2014 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.210000 audit[2014]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffece89160 a2=0 a3=0 items=0 ppid=1945 pid=2014 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.210000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:24:41.212000 audit[2016]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=2016 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.212000 audit[2016]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffd68e5310 a2=0 a3=0 items=0 ppid=1945 pid=2016 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.212000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:24:41.215000 audit[2018]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=2018 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.215000 audit[2018]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffc47fcd60 a2=0 a3=0 items=0 ppid=1945 pid=2018 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.215000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:24:41.265000 audit[2048]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2048 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.265000 audit[2048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffff5f04940 a2=0 a3=0 items=0 ppid=1945 pid=2048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.265000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Dec 16 12:24:41.269000 audit[2050]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2050 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.269000 audit[2050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffffb7f90a0 a2=0 a3=0 items=0 ppid=1945 
pid=2050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.269000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Dec 16 12:24:41.272000 audit[2052]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2052 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.272000 audit[2052]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdcef3330 a2=0 a3=0 items=0 ppid=1945 pid=2052 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.272000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Dec 16 12:24:41.274000 audit[2054]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2054 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.274000 audit[2054]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffea689e0 a2=0 a3=0 items=0 ppid=1945 pid=2054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.274000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Dec 16 12:24:41.277000 audit[2056]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2056 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.277000 audit[2056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffef7f0cd0 a2=0 a3=0 items=0 ppid=1945 pid=2056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.277000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Dec 16 12:24:41.279000 audit[2058]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2058 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.279000 audit[2058]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffe5dfd300 a2=0 a3=0 items=0 ppid=1945 pid=2058 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.279000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:24:41.281000 audit[2060]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2060 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.281000 audit[2060]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc4f91cf0 a2=0 a3=0 items=0 ppid=1945 pid=2060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 
12:24:41.281000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:24:41.283000 audit[2062]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2062 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.283000 audit[2062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe2d40400 a2=0 a3=0 items=0 ppid=1945 pid=2062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.283000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Dec 16 12:24:41.286000 audit[2064]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2064 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.286000 audit[2064]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=fffff7e62110 a2=0 a3=0 items=0 ppid=1945 pid=2064 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.286000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Dec 16 12:24:41.288000 audit[2066]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.288000 audit[2066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffd60eab80 a2=0 a3=0 items=0 ppid=1945 pid=2066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.288000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Dec 16 12:24:41.290000 audit[2068]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2068 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.290000 audit[2068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd6824bc0 a2=0 a3=0 items=0 ppid=1945 pid=2068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.290000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Dec 16 12:24:41.292000 audit[2070]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2070 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.292000 audit[2070]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffee42f160 a2=0 a3=0 items=0 ppid=1945 pid=2070 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.292000 audit: 
PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Dec 16 12:24:41.295000 audit[2072]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2072 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.295000 audit[2072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffe5aa29d0 a2=0 a3=0 items=0 ppid=1945 pid=2072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.295000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Dec 16 12:24:41.300000 audit[2077]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2077 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.300000 audit[2077]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc8e61950 a2=0 a3=0 items=0 ppid=1945 pid=2077 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.300000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:24:41.302000 audit[2079]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2079 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.302000 audit[2079]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=fffffe08d3d0 a2=0 a3=0 items=0 ppid=1945 pid=2079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.302000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:24:41.304000 audit[2081]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.304000 audit[2081]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=fffffa5f1f60 a2=0 a3=0 items=0 ppid=1945 pid=2081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.304000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:24:41.307000 audit[2083]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2083 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.307000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffed6289a0 a2=0 a3=0 items=0 ppid=1945 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.307000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Dec 16 12:24:41.310000 audit[2085]: NETFILTER_CFG table=filter:32 family=10 
entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.310000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffcdcca9a0 a2=0 a3=0 items=0 ppid=1945 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.310000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Dec 16 12:24:41.312000 audit[2087]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:41.312000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffe51897f0 a2=0 a3=0 items=0 ppid=1945 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.312000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Dec 16 12:24:41.337000 audit[2091]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2091 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.337000 audit[2091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffc8daf670 a2=0 a3=0 items=0 ppid=1945 pid=2091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.337000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Dec 16 12:24:41.340000 audit[2093]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2093 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.340000 audit[2093]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=fffff94e0fb0 a2=0 a3=0 items=0 ppid=1945 pid=2093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.340000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Dec 16 12:24:41.350000 audit[2101]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.350000 audit[2101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=ffffee736b70 a2=0 a3=0 items=0 ppid=1945 pid=2101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.350000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Dec 16 12:24:41.363000 audit[2107]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 
12:24:41.363000 audit[2107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=fffff4399f70 a2=0 a3=0 items=0 ppid=1945 pid=2107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.363000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Dec 16 12:24:41.365000 audit[2109]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.365000 audit[2109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=ffffc7047310 a2=0 a3=0 items=0 ppid=1945 pid=2109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.365000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Dec 16 12:24:41.368000 audit[2111]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2111 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.368000 audit[2111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffff11acf0 a2=0 a3=0 items=0 ppid=1945 pid=2111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.368000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Dec 16 12:24:41.370000 audit[2113]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2113 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.370000 audit[2113]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffffe711230 a2=0 a3=0 items=0 ppid=1945 pid=2113 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.370000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Dec 16 12:24:41.373000 audit[2115]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2115 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:41.373000 audit[2115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffd4865a00 a2=0 a3=0 items=0 ppid=1945 pid=2115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:41.373000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Dec 16 12:24:41.375326 
systemd-networkd[1501]: docker0: Link UP Dec 16 12:24:41.379812 dockerd[1945]: time="2025-12-16T12:24:41.379687389Z" level=info msg="Loading containers: done." Dec 16 12:24:41.406720 dockerd[1945]: time="2025-12-16T12:24:41.405704606Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Dec 16 12:24:41.406720 dockerd[1945]: time="2025-12-16T12:24:41.405930846Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Dec 16 12:24:41.406720 dockerd[1945]: time="2025-12-16T12:24:41.406213176Z" level=info msg="Initializing buildkit" Dec 16 12:24:41.434320 dockerd[1945]: time="2025-12-16T12:24:41.434225719Z" level=info msg="Completed buildkit initialization" Dec 16 12:24:41.443047 dockerd[1945]: time="2025-12-16T12:24:41.442964882Z" level=info msg="Daemon has completed initialization" Dec 16 12:24:41.444000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:41.443645 systemd[1]: Started docker.service - Docker Application Container Engine. Dec 16 12:24:41.445111 dockerd[1945]: time="2025-12-16T12:24:41.443690502Z" level=info msg="API listen on /run/docker.sock" Dec 16 12:24:42.054609 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3425277582-merged.mount: Deactivated successfully. Dec 16 12:24:42.184333 containerd[1611]: time="2025-12-16T12:24:42.184258962Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Dec 16 12:24:42.802008 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3184886711.mount: Deactivated successfully. 
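
The NETFILTER_CFG entries above come in pairs of SYSCALL and PROCTITLE audit records, one per xtables-nft-multi invocation that dockerd (ppid=1945) makes while creating its DOCKER-* chains: arch=c00000b7 is AArch64, and syscall 211 on the arm64 table is sendmsg, i.e. the ruleset changes are handed to nftables over netlink. The PROCTITLE payload is the invoking command line, hex-encoded with the arguments separated by NUL bytes. A minimal decoding sketch (decode_proctitle is a hypothetical helper, not something shipped on this host), applied to the first PROCTITLE record in the block above:

# Minimal sketch: turn an audit PROCTITLE hex payload back into argv.
# The payload is the raw /proc/<pid>/cmdline contents (NUL-separated
# arguments), hex-encoded by the audit subsystem.
def decode_proctitle(hex_payload: str) -> list[str]:
    raw = bytes.fromhex(hex_payload)
    return [arg.decode("utf-8", "replace") for arg in raw.split(b"\x00") if arg]

print(decode_proctitle(
    "2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354"
))
# -> ['/usr/bin/iptables', '--wait', '-t', 'filter', '-N', 'DOCKER-CT']

Decoded the same way, the remaining records spell out the rest of Docker's chain and rule setup: DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER, the MASQUERADE rule for 172.17.0.0/16 on docker0, and the matching ip6tables calls.
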
Dec 16 12:24:43.543308 containerd[1611]: time="2025-12-16T12:24:43.542743554Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:43.544465 containerd[1611]: time="2025-12-16T12:24:43.544250444Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=0" Dec 16 12:24:43.545329 containerd[1611]: time="2025-12-16T12:24:43.545286980Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:43.548537 containerd[1611]: time="2025-12-16T12:24:43.548470998Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:43.550604 containerd[1611]: time="2025-12-16T12:24:43.550563892Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.366241637s" Dec 16 12:24:43.550961 containerd[1611]: time="2025-12-16T12:24:43.550722142Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Dec 16 12:24:43.551620 containerd[1611]: time="2025-12-16T12:24:43.551584603Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Dec 16 12:24:44.609173 containerd[1611]: time="2025-12-16T12:24:44.609065278Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:44.610919 containerd[1611]: time="2025-12-16T12:24:44.610865703Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=0" Dec 16 12:24:44.611912 containerd[1611]: time="2025-12-16T12:24:44.611867231Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:44.614820 containerd[1611]: time="2025-12-16T12:24:44.614791283Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:44.616492 containerd[1611]: time="2025-12-16T12:24:44.616449986Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.064497721s" Dec 16 12:24:44.616492 containerd[1611]: time="2025-12-16T12:24:44.616491511Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Dec 16 12:24:44.617189 containerd[1611]: 
time="2025-12-16T12:24:44.617167150Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Dec 16 12:24:45.492554 containerd[1611]: time="2025-12-16T12:24:45.491444468Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:45.493118 containerd[1611]: time="2025-12-16T12:24:45.493064411Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=0" Dec 16 12:24:45.494222 containerd[1611]: time="2025-12-16T12:24:45.494174525Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:45.497601 containerd[1611]: time="2025-12-16T12:24:45.497566301Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:45.498721 containerd[1611]: time="2025-12-16T12:24:45.498684368Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 881.370579ms" Dec 16 12:24:45.498818 containerd[1611]: time="2025-12-16T12:24:45.498802243Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Dec 16 12:24:45.499642 containerd[1611]: time="2025-12-16T12:24:45.499606818Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Dec 16 12:24:46.413375 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2790105405.mount: Deactivated successfully. 
Dec 16 12:24:46.661292 containerd[1611]: time="2025-12-16T12:24:46.660879909Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:46.663194 containerd[1611]: time="2025-12-16T12:24:46.663141712Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Dec 16 12:24:46.664475 containerd[1611]: time="2025-12-16T12:24:46.664320362Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:46.669244 containerd[1611]: time="2025-12-16T12:24:46.669176156Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:46.670360 containerd[1611]: time="2025-12-16T12:24:46.669941989Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.170061728s" Dec 16 12:24:46.670360 containerd[1611]: time="2025-12-16T12:24:46.669987800Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Dec 16 12:24:46.670659 containerd[1611]: time="2025-12-16T12:24:46.670620758Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Dec 16 12:24:47.284692 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount673677795.mount: Deactivated successfully. 
Dec 16 12:24:47.916709 containerd[1611]: time="2025-12-16T12:24:47.916636261Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:47.919851 containerd[1611]: time="2025-12-16T12:24:47.918921750Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19575910" Dec 16 12:24:47.919851 containerd[1611]: time="2025-12-16T12:24:47.919710751Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:47.924687 containerd[1611]: time="2025-12-16T12:24:47.924648685Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:47.925794 containerd[1611]: time="2025-12-16T12:24:47.925751192Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.25508762s" Dec 16 12:24:47.925794 containerd[1611]: time="2025-12-16T12:24:47.925790090Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Dec 16 12:24:47.926247 containerd[1611]: time="2025-12-16T12:24:47.926225968Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Dec 16 12:24:48.477963 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1797180635.mount: Deactivated successfully. 
Dec 16 12:24:48.491132 containerd[1611]: time="2025-12-16T12:24:48.491072277Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:48.493520 containerd[1611]: time="2025-12-16T12:24:48.493450920Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Dec 16 12:24:48.494576 containerd[1611]: time="2025-12-16T12:24:48.494527796Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:48.498943 containerd[1611]: time="2025-12-16T12:24:48.498118329Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:48.498943 containerd[1611]: time="2025-12-16T12:24:48.498771371Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 571.633669ms" Dec 16 12:24:48.498943 containerd[1611]: time="2025-12-16T12:24:48.498805594Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Dec 16 12:24:48.499493 containerd[1611]: time="2025-12-16T12:24:48.499467352Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Dec 16 12:24:48.932102 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Dec 16 12:24:48.935035 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:49.129040 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2036605006.mount: Deactivated successfully. Dec 16 12:24:49.175518 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:49.176000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:49.177813 kernel: kauditd_printk_skb: 134 callbacks suppressed Dec 16 12:24:49.177887 kernel: audit: type=1130 audit(1765887889.176:283): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:49.188293 (kubelet)[2297]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Dec 16 12:24:49.247281 kubelet[2297]: E1216 12:24:49.247226 2297 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Dec 16 12:24:49.253239 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Dec 16 12:24:49.253429 systemd[1]: kubelet.service: Failed with result 'exit-code'. Dec 16 12:24:49.256455 systemd[1]: kubelet.service: Consumed 174ms CPU time, 106.9M memory peak. 
Dec 16 12:24:49.256000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:24:49.260305 kernel: audit: type=1131 audit(1765887889.256:284): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:24:51.098809 containerd[1611]: time="2025-12-16T12:24:51.098710267Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:51.100324 containerd[1611]: time="2025-12-16T12:24:51.100204860Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=85821047" Dec 16 12:24:51.104304 containerd[1611]: time="2025-12-16T12:24:51.103942362Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:51.106988 containerd[1611]: time="2025-12-16T12:24:51.106929668Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:24:51.108302 containerd[1611]: time="2025-12-16T12:24:51.108112323Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 2.608527626s" Dec 16 12:24:51.108302 containerd[1611]: time="2025-12-16T12:24:51.108151190Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Dec 16 12:24:56.191000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:56.191221 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:56.192424 systemd[1]: kubelet.service: Consumed 174ms CPU time, 106.9M memory peak. Dec 16 12:24:56.191000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:56.195244 kernel: audit: type=1130 audit(1765887896.191:285): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:56.195357 kernel: audit: type=1131 audit(1765887896.191:286): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:56.197588 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:56.240559 systemd[1]: Reload requested from client PID 2382 ('systemctl') (unit session-7.scope)... Dec 16 12:24:56.240579 systemd[1]: Reloading... 
Dec 16 12:24:56.386414 zram_generator::config[2436]: No configuration found. Dec 16 12:24:56.572359 systemd[1]: Reloading finished in 331 ms. Dec 16 12:24:56.606600 kernel: audit: type=1334 audit(1765887896.599:287): prog-id=61 op=LOAD Dec 16 12:24:56.606696 kernel: audit: type=1334 audit(1765887896.599:288): prog-id=58 op=UNLOAD Dec 16 12:24:56.606717 kernel: audit: type=1334 audit(1765887896.599:289): prog-id=62 op=LOAD Dec 16 12:24:56.606736 kernel: audit: type=1334 audit(1765887896.599:290): prog-id=63 op=LOAD Dec 16 12:24:56.606755 kernel: audit: type=1334 audit(1765887896.599:291): prog-id=59 op=UNLOAD Dec 16 12:24:56.606824 kernel: audit: type=1334 audit(1765887896.599:292): prog-id=60 op=UNLOAD Dec 16 12:24:56.606846 kernel: audit: type=1334 audit(1765887896.600:293): prog-id=64 op=LOAD Dec 16 12:24:56.606874 kernel: audit: type=1334 audit(1765887896.600:294): prog-id=57 op=UNLOAD Dec 16 12:24:56.599000 audit: BPF prog-id=61 op=LOAD Dec 16 12:24:56.599000 audit: BPF prog-id=58 op=UNLOAD Dec 16 12:24:56.599000 audit: BPF prog-id=62 op=LOAD Dec 16 12:24:56.599000 audit: BPF prog-id=63 op=LOAD Dec 16 12:24:56.599000 audit: BPF prog-id=59 op=UNLOAD Dec 16 12:24:56.599000 audit: BPF prog-id=60 op=UNLOAD Dec 16 12:24:56.600000 audit: BPF prog-id=64 op=LOAD Dec 16 12:24:56.600000 audit: BPF prog-id=57 op=UNLOAD Dec 16 12:24:56.605000 audit: BPF prog-id=65 op=LOAD Dec 16 12:24:56.605000 audit: BPF prog-id=56 op=UNLOAD Dec 16 12:24:56.606000 audit: BPF prog-id=66 op=LOAD Dec 16 12:24:56.613000 audit: BPF prog-id=48 op=UNLOAD Dec 16 12:24:56.613000 audit: BPF prog-id=67 op=LOAD Dec 16 12:24:56.613000 audit: BPF prog-id=68 op=LOAD Dec 16 12:24:56.613000 audit: BPF prog-id=49 op=UNLOAD Dec 16 12:24:56.614000 audit: BPF prog-id=50 op=UNLOAD Dec 16 12:24:56.614000 audit: BPF prog-id=69 op=LOAD Dec 16 12:24:56.615000 audit: BPF prog-id=70 op=LOAD Dec 16 12:24:56.615000 audit: BPF prog-id=54 op=UNLOAD Dec 16 12:24:56.615000 audit: BPF prog-id=55 op=UNLOAD Dec 16 12:24:56.616000 audit: BPF prog-id=71 op=LOAD Dec 16 12:24:56.616000 audit: BPF prog-id=41 op=UNLOAD Dec 16 12:24:56.617000 audit: BPF prog-id=72 op=LOAD Dec 16 12:24:56.617000 audit: BPF prog-id=73 op=LOAD Dec 16 12:24:56.617000 audit: BPF prog-id=42 op=UNLOAD Dec 16 12:24:56.617000 audit: BPF prog-id=43 op=UNLOAD Dec 16 12:24:56.619000 audit: BPF prog-id=74 op=LOAD Dec 16 12:24:56.619000 audit: BPF prog-id=44 op=UNLOAD Dec 16 12:24:56.619000 audit: BPF prog-id=75 op=LOAD Dec 16 12:24:56.619000 audit: BPF prog-id=51 op=UNLOAD Dec 16 12:24:56.620000 audit: BPF prog-id=76 op=LOAD Dec 16 12:24:56.620000 audit: BPF prog-id=77 op=LOAD Dec 16 12:24:56.620000 audit: BPF prog-id=52 op=UNLOAD Dec 16 12:24:56.620000 audit: BPF prog-id=53 op=UNLOAD Dec 16 12:24:56.621000 audit: BPF prog-id=78 op=LOAD Dec 16 12:24:56.621000 audit: BPF prog-id=45 op=UNLOAD Dec 16 12:24:56.621000 audit: BPF prog-id=79 op=LOAD Dec 16 12:24:56.621000 audit: BPF prog-id=80 op=LOAD Dec 16 12:24:56.621000 audit: BPF prog-id=46 op=UNLOAD Dec 16 12:24:56.621000 audit: BPF prog-id=47 op=UNLOAD Dec 16 12:24:56.636170 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Dec 16 12:24:56.636248 systemd[1]: kubelet.service: Failed with result 'signal'. Dec 16 12:24:56.636728 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:56.636786 systemd[1]: kubelet.service: Consumed 118ms CPU time, 95M memory peak. 
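
In the stretch above, several records show up twice: once as named audit events (SERVICE_START/SERVICE_STOP, the BPF prog-id LOAD/UNLOAD lines from the systemd reload) and once via kauditd as numeric "audit: type=NNNN" kernel lines, with "kauditd_printk_skb: ... callbacks suppressed" when messages are rate-limited. A small lookup sketch for the record types appearing in this log; the 1130/1131/1334 pairings can be read off the lines above, the other three values are quoted from the kernel's uapi audit.h rather than from this log, and AUDIT_TYPES/name_of are illustrative names:

# Audit record types seen in this log, keyed the way the kernel prints them
# in "audit: type=NNNN audit(...)" lines.
AUDIT_TYPES = {
    1130: "SERVICE_START",  # systemd unit started (docker, kubelet above)
    1131: "SERVICE_STOP",   # systemd unit stopped or failed
    1300: "SYSCALL",        # per include/uapi/linux/audit.h
    1325: "NETFILTER_CFG",  # per include/uapi/linux/audit.h
    1327: "PROCTITLE",      # per include/uapi/linux/audit.h
    1334: "BPF",            # BPF program load/unload during the systemd reload
}

def name_of(audit_type: int) -> str:
    return AUDIT_TYPES.get(audit_type, f"UNKNOWN({audit_type})")

print(name_of(1334))  # BPF
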
Dec 16 12:24:56.636000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Dec 16 12:24:56.639129 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:24:56.809099 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:24:56.809000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:24:56.823847 (kubelet)[2477]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:24:56.870953 kubelet[2477]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:24:56.871806 kubelet[2477]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Dec 16 12:24:56.874313 kubelet[2477]: I1216 12:24:56.872923 2477 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:24:58.077743 kubelet[2477]: I1216 12:24:58.077693 2477 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:24:58.078345 kubelet[2477]: I1216 12:24:58.078323 2477 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:24:58.078499 kubelet[2477]: I1216 12:24:58.078484 2477 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:24:58.078573 kubelet[2477]: I1216 12:24:58.078557 2477 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:24:58.079382 kubelet[2477]: I1216 12:24:58.079342 2477 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:24:58.088562 kubelet[2477]: E1216 12:24:58.088497 2477 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://128.140.49.38:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Dec 16 12:24:58.090130 kubelet[2477]: I1216 12:24:58.090102 2477 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:24:58.095237 kubelet[2477]: I1216 12:24:58.095037 2477 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:24:58.100296 kubelet[2477]: I1216 12:24:58.099789 2477 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Dec 16 12:24:58.100296 kubelet[2477]: I1216 12:24:58.100032 2477 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:24:58.100460 kubelet[2477]: I1216 12:24:58.100062 2477 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-6-95bdd2e3e7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:24:58.100460 kubelet[2477]: I1216 12:24:58.100361 2477 topology_manager.go:138] "Creating topology manager with none policy" Dec 16 12:24:58.100460 kubelet[2477]: I1216 12:24:58.100370 2477 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:24:58.100586 kubelet[2477]: I1216 12:24:58.100480 2477 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:24:58.103329 kubelet[2477]: I1216 12:24:58.103294 2477 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:24:58.104697 kubelet[2477]: I1216 12:24:58.104660 2477 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:24:58.104697 kubelet[2477]: I1216 12:24:58.104688 2477 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:24:58.104793 kubelet[2477]: I1216 12:24:58.104711 2477 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:24:58.104793 kubelet[2477]: I1216 12:24:58.104724 2477 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:24:58.108020 kubelet[2477]: I1216 12:24:58.107980 2477 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:24:58.108784 kubelet[2477]: I1216 12:24:58.108749 2477 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:24:58.108884 kubelet[2477]: I1216 12:24:58.108791 2477 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 
12:24:58.108884 kubelet[2477]: W1216 12:24:58.108876 2477 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Dec 16 12:24:58.118014 kubelet[2477]: I1216 12:24:58.116898 2477 server.go:1262] "Started kubelet" Dec 16 12:24:58.118014 kubelet[2477]: E1216 12:24:58.117109 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://128.140.49.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:24:58.118570 kubelet[2477]: E1216 12:24:58.118515 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://128.140.49.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-6-95bdd2e3e7&limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:24:58.119150 kubelet[2477]: I1216 12:24:58.119099 2477 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:24:58.121147 kubelet[2477]: I1216 12:24:58.121105 2477 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:24:58.121435 kubelet[2477]: I1216 12:24:58.121361 2477 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:24:58.121510 kubelet[2477]: I1216 12:24:58.121455 2477 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:24:58.122088 kubelet[2477]: I1216 12:24:58.121818 2477 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:24:58.123980 kubelet[2477]: I1216 12:24:58.123954 2477 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:24:58.126602 kubelet[2477]: E1216 12:24:58.124808 2477 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://128.140.49.38:6443/api/v1/namespaces/default/events\": dial tcp 128.140.49.38:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4515-1-0-6-95bdd2e3e7.1881b1aff06ab7a5 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4515-1-0-6-95bdd2e3e7,UID:ci-4515-1-0-6-95bdd2e3e7,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4515-1-0-6-95bdd2e3e7,},FirstTimestamp:2025-12-16 12:24:58.116863909 +0000 UTC m=+1.286741454,LastTimestamp:2025-12-16 12:24:58.116863909 +0000 UTC m=+1.286741454,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4515-1-0-6-95bdd2e3e7,}" Dec 16 12:24:58.128682 kubelet[2477]: E1216 12:24:58.128652 2477 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:24:58.129033 kubelet[2477]: I1216 12:24:58.129002 2477 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:24:58.128000 audit[2493]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2493 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.128000 audit[2493]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffff771f520 a2=0 a3=0 items=0 ppid=2477 pid=2493 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.128000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:24:58.129000 audit[2494]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2494 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.129000 audit[2494]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe236a030 a2=0 a3=0 items=0 ppid=2477 pid=2494 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.129000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:24:58.132331 kubelet[2477]: E1216 12:24:58.132209 2477 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" Dec 16 12:24:58.132331 kubelet[2477]: I1216 12:24:58.132261 2477 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:24:58.133306 kubelet[2477]: I1216 12:24:58.132493 2477 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:24:58.133306 kubelet[2477]: I1216 12:24:58.132563 2477 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:24:58.133306 kubelet[2477]: E1216 12:24:58.133034 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://128.140.49.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:24:58.133412 kubelet[2477]: E1216 12:24:58.133346 2477 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.49.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-6-95bdd2e3e7?timeout=10s\": dial tcp 128.140.49.38:6443: connect: connection refused" interval="200ms" Dec 16 12:24:58.134034 kubelet[2477]: I1216 12:24:58.133999 2477 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:24:58.134111 kubelet[2477]: I1216 12:24:58.134092 2477 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Dec 16 12:24:58.133000 audit[2496]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2496 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.133000 audit[2496]: SYSCALL arch=c00000b7 syscall=211 success=yes 
exit=340 a0=3 a1=ffffdfbe1910 a2=0 a3=0 items=0 ppid=2477 pid=2496 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.135395 kubelet[2477]: I1216 12:24:58.135343 2477 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:24:58.133000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:24:58.136000 audit[2498]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2498 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.136000 audit[2498]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe9fcdc10 a2=0 a3=0 items=0 ppid=2477 pid=2498 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.136000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:24:58.148000 audit[2502]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2502 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.148000 audit[2502]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=fffff70e5e00 a2=0 a3=0 items=0 ppid=2477 pid=2502 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.148000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Dec 16 12:24:58.150457 kubelet[2477]: I1216 12:24:58.150416 2477 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 12:24:58.150000 audit[2503]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2503 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:58.150000 audit[2503]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffffd49b340 a2=0 a3=0 items=0 ppid=2477 pid=2503 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.150000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Dec 16 12:24:58.152255 kubelet[2477]: I1216 12:24:58.152213 2477 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Dec 16 12:24:58.152255 kubelet[2477]: I1216 12:24:58.152242 2477 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:24:58.152376 kubelet[2477]: I1216 12:24:58.152311 2477 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:24:58.152403 kubelet[2477]: E1216 12:24:58.152375 2477 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:24:58.152000 audit[2504]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2504 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.152000 audit[2504]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc8dd0c90 a2=0 a3=0 items=0 ppid=2477 pid=2504 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.152000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:24:58.153000 audit[2506]: NETFILTER_CFG table=mangle:49 family=10 entries=1 op=nft_register_chain pid=2506 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:58.153000 audit[2506]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffb5ec940 a2=0 a3=0 items=0 ppid=2477 pid=2506 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.153000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Dec 16 12:24:58.156941 kubelet[2477]: E1216 12:24:58.156858 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://128.140.49.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:24:58.158000 audit[2508]: NETFILTER_CFG table=nat:50 family=10 entries=1 op=nft_register_chain pid=2508 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:58.158000 audit[2508]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5a38970 a2=0 a3=0 items=0 ppid=2477 pid=2508 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.158000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:24:58.159000 audit[2511]: NETFILTER_CFG table=filter:51 family=10 entries=1 op=nft_register_chain pid=2511 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:24:58.159000 audit[2511]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffe7527690 a2=0 a3=0 items=0 ppid=2477 pid=2511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.159000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:24:58.160000 audit[2509]: 
NETFILTER_CFG table=nat:52 family=2 entries=1 op=nft_register_chain pid=2509 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.160000 audit[2509]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc162a650 a2=0 a3=0 items=0 ppid=2477 pid=2509 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.160000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Dec 16 12:24:58.163000 audit[2512]: NETFILTER_CFG table=filter:53 family=2 entries=1 op=nft_register_chain pid=2512 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:24:58.163000 audit[2512]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcbd5ca80 a2=0 a3=0 items=0 ppid=2477 pid=2512 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:58.163000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Dec 16 12:24:58.169263 kubelet[2477]: I1216 12:24:58.169229 2477 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:24:58.169263 kubelet[2477]: I1216 12:24:58.169255 2477 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:24:58.169413 kubelet[2477]: I1216 12:24:58.169311 2477 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:24:58.171591 kubelet[2477]: I1216 12:24:58.171522 2477 policy_none.go:49] "None policy: Start" Dec 16 12:24:58.171591 kubelet[2477]: I1216 12:24:58.171545 2477 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:24:58.171591 kubelet[2477]: I1216 12:24:58.171557 2477 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:24:58.173255 kubelet[2477]: I1216 12:24:58.173233 2477 policy_none.go:47] "Start" Dec 16 12:24:58.178248 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Dec 16 12:24:58.189295 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Dec 16 12:24:58.193502 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. Dec 16 12:24:58.205018 kubelet[2477]: E1216 12:24:58.204975 2477 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:24:58.206763 kubelet[2477]: I1216 12:24:58.205777 2477 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:24:58.206763 kubelet[2477]: I1216 12:24:58.206425 2477 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:24:58.208377 kubelet[2477]: I1216 12:24:58.208296 2477 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:24:58.209557 kubelet[2477]: E1216 12:24:58.209413 2477 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 12:24:58.209557 kubelet[2477]: E1216 12:24:58.209555 2477 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4515-1-0-6-95bdd2e3e7\" not found" Dec 16 12:24:58.269376 systemd[1]: Created slice kubepods-burstable-pod2f466efe848e856ca7c9bf6e56d3ab93.slice - libcontainer container kubepods-burstable-pod2f466efe848e856ca7c9bf6e56d3ab93.slice. Dec 16 12:24:58.277492 kubelet[2477]: E1216 12:24:58.277456 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.281932 systemd[1]: Created slice kubepods-burstable-podf1239b6595f69f17e5b85ca50f062d71.slice - libcontainer container kubepods-burstable-podf1239b6595f69f17e5b85ca50f062d71.slice. Dec 16 12:24:58.284975 kubelet[2477]: E1216 12:24:58.284909 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.292941 systemd[1]: Created slice kubepods-burstable-pod8680ee0b60674190213cb19d68639c50.slice - libcontainer container kubepods-burstable-pod8680ee0b60674190213cb19d68639c50.slice. Dec 16 12:24:58.298051 kubelet[2477]: E1216 12:24:58.298016 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.309596 kubelet[2477]: I1216 12:24:58.309515 2477 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.310114 kubelet[2477]: E1216 12:24:58.310062 2477 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://128.140.49.38:6443/api/v1/nodes\": dial tcp 128.140.49.38:6443: connect: connection refused" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.336975 kubelet[2477]: I1216 12:24:58.334521 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337254 kubelet[2477]: I1216 12:24:58.337219 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337691 kubelet[2477]: I1216 12:24:58.337477 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337691 kubelet[2477]: I1216 12:24:58.337519 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: 
\"kubernetes.io/host-path/2f466efe848e856ca7c9bf6e56d3ab93-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"2f466efe848e856ca7c9bf6e56d3ab93\") " pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337691 kubelet[2477]: I1216 12:24:58.337548 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337691 kubelet[2477]: I1216 12:24:58.337568 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8680ee0b60674190213cb19d68639c50-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"8680ee0b60674190213cb19d68639c50\") " pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337691 kubelet[2477]: I1216 12:24:58.337588 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2f466efe848e856ca7c9bf6e56d3ab93-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"2f466efe848e856ca7c9bf6e56d3ab93\") " pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337932 kubelet[2477]: I1216 12:24:58.337604 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2f466efe848e856ca7c9bf6e56d3ab93-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"2f466efe848e856ca7c9bf6e56d3ab93\") " pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.337932 kubelet[2477]: E1216 12:24:58.337582 2477 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.49.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-6-95bdd2e3e7?timeout=10s\": dial tcp 128.140.49.38:6443: connect: connection refused" interval="400ms" Dec 16 12:24:58.337932 kubelet[2477]: I1216 12:24:58.337622 2477 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.514066 kubelet[2477]: I1216 12:24:58.513980 2477 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.514941 kubelet[2477]: E1216 12:24:58.514893 2477 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://128.140.49.38:6443/api/v1/nodes\": dial tcp 128.140.49.38:6443: connect: connection refused" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.582689 containerd[1611]: time="2025-12-16T12:24:58.582340864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-6-95bdd2e3e7,Uid:2f466efe848e856ca7c9bf6e56d3ab93,Namespace:kube-system,Attempt:0,}" Dec 16 12:24:58.587562 containerd[1611]: time="2025-12-16T12:24:58.587305869Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7,Uid:f1239b6595f69f17e5b85ca50f062d71,Namespace:kube-system,Attempt:0,}" Dec 16 12:24:58.601084 containerd[1611]: time="2025-12-16T12:24:58.601003676Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-6-95bdd2e3e7,Uid:8680ee0b60674190213cb19d68639c50,Namespace:kube-system,Attempt:0,}" Dec 16 12:24:58.739250 kubelet[2477]: E1216 12:24:58.739200 2477 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.49.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-6-95bdd2e3e7?timeout=10s\": dial tcp 128.140.49.38:6443: connect: connection refused" interval="800ms" Dec 16 12:24:58.918258 kubelet[2477]: I1216 12:24:58.917566 2477 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:58.918258 kubelet[2477]: E1216 12:24:58.918080 2477 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://128.140.49.38:6443/api/v1/nodes\": dial tcp 128.140.49.38:6443: connect: connection refused" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:24:59.013863 kubelet[2477]: E1216 12:24:59.013787 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://128.140.49.38:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4515-1-0-6-95bdd2e3e7&limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Dec 16 12:24:59.055882 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2146783008.mount: Deactivated successfully. Dec 16 12:24:59.062243 containerd[1611]: time="2025-12-16T12:24:59.062173806Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:24:59.065014 containerd[1611]: time="2025-12-16T12:24:59.064943130Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:24:59.068164 containerd[1611]: time="2025-12-16T12:24:59.068081444Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:24:59.069603 containerd[1611]: time="2025-12-16T12:24:59.069545364Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:24:59.071041 containerd[1611]: time="2025-12-16T12:24:59.070978685Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:24:59.073333 containerd[1611]: time="2025-12-16T12:24:59.073239703Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:24:59.074939 containerd[1611]: time="2025-12-16T12:24:59.074866019Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Dec 16 12:24:59.075578 containerd[1611]: time="2025-12-16T12:24:59.075494481Z" level=info msg="ImageCreate event 
name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Dec 16 12:24:59.077239 kubelet[2477]: E1216 12:24:59.077186 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://128.140.49.38:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Dec 16 12:24:59.077681 containerd[1611]: time="2025-12-16T12:24:59.077614903Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 491.163724ms" Dec 16 12:24:59.079092 containerd[1611]: time="2025-12-16T12:24:59.079021145Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 488.672144ms" Dec 16 12:24:59.088886 containerd[1611]: time="2025-12-16T12:24:59.088748558Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 484.811497ms" Dec 16 12:24:59.115388 containerd[1611]: time="2025-12-16T12:24:59.115006719Z" level=info msg="connecting to shim 21747d8025f675998a0a181375c4f89924d0b45b4b62cdf42d2476880db4699a" address="unix:///run/containerd/s/6e194d5f3b6751fdfeea2092232d3eeeeb5274f109d22284aba24389259e9d83" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:24:59.119565 kubelet[2477]: E1216 12:24:59.118767 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get \"https://128.140.49.38:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Dec 16 12:24:59.123717 kubelet[2477]: E1216 12:24:59.123586 2477 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://128.140.49.38:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 128.140.49.38:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Dec 16 12:24:59.131700 containerd[1611]: time="2025-12-16T12:24:59.131558706Z" level=info msg="connecting to shim c95a28c18f937fce429981a73a7447764cae446962ae4e6e1607435033dcf235" address="unix:///run/containerd/s/9347e689c21c6af36ff4bff80b464dbc4689e5ce197d794ae314d09dd4cea5ad" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:24:59.143147 containerd[1611]: time="2025-12-16T12:24:59.142591803Z" level=info msg="connecting to shim ff08975cdae3f5e1935424f76b0f1114521e1da3759d6a5d20316fdc0113cc53" 
address="unix:///run/containerd/s/acc50c7990515e543e73ffe1b9e37f51dc5628619c3114f970a9a20107b9be26" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:24:59.160667 systemd[1]: Started cri-containerd-21747d8025f675998a0a181375c4f89924d0b45b4b62cdf42d2476880db4699a.scope - libcontainer container 21747d8025f675998a0a181375c4f89924d0b45b4b62cdf42d2476880db4699a. Dec 16 12:24:59.180993 systemd[1]: Started cri-containerd-c95a28c18f937fce429981a73a7447764cae446962ae4e6e1607435033dcf235.scope - libcontainer container c95a28c18f937fce429981a73a7447764cae446962ae4e6e1607435033dcf235. Dec 16 12:24:59.184000 audit: BPF prog-id=81 op=LOAD Dec 16 12:24:59.189000 audit: BPF prog-id=82 op=LOAD Dec 16 12:24:59.189000 audit[2548]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40000fe180 a2=98 a3=0 items=0 ppid=2527 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.189000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.189000 audit: BPF prog-id=82 op=UNLOAD Dec 16 12:24:59.189000 audit[2548]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2527 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.189000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.189000 audit: BPF prog-id=83 op=LOAD Dec 16 12:24:59.189000 audit[2548]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40000fe3e8 a2=98 a3=0 items=0 ppid=2527 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.189000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.189000 audit: BPF prog-id=84 op=LOAD Dec 16 12:24:59.189000 audit[2548]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40000fe168 a2=98 a3=0 items=0 ppid=2527 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.189000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.190000 audit: BPF prog-id=84 op=UNLOAD Dec 16 12:24:59.190000 audit[2548]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2527 pid=2548 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.190000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.190000 audit: BPF prog-id=83 op=UNLOAD Dec 16 12:24:59.190000 audit[2548]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2527 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.190000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.190000 audit: BPF prog-id=85 op=LOAD Dec 16 12:24:59.190000 audit[2548]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40000fe648 a2=98 a3=0 items=0 ppid=2527 pid=2548 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.190000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3231373437643830323566363735393938613061313831333735633466 Dec 16 12:24:59.201543 systemd[1]: Started cri-containerd-ff08975cdae3f5e1935424f76b0f1114521e1da3759d6a5d20316fdc0113cc53.scope - libcontainer container ff08975cdae3f5e1935424f76b0f1114521e1da3759d6a5d20316fdc0113cc53. 
Dec 16 12:24:59.204000 audit: BPF prog-id=86 op=LOAD Dec 16 12:24:59.205000 audit: BPF prog-id=87 op=LOAD Dec 16 12:24:59.205000 audit[2575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.205000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.206000 audit: BPF prog-id=87 op=UNLOAD Dec 16 12:24:59.206000 audit[2575]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.206000 audit: BPF prog-id=88 op=LOAD Dec 16 12:24:59.206000 audit[2575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.206000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.207000 audit: BPF prog-id=89 op=LOAD Dec 16 12:24:59.207000 audit[2575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.208000 audit: BPF prog-id=89 op=UNLOAD Dec 16 12:24:59.208000 audit[2575]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.208000 audit: BPF prog-id=88 op=UNLOAD Dec 16 12:24:59.208000 audit[2575]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.209000 audit: BPF prog-id=90 op=LOAD Dec 16 12:24:59.209000 audit[2575]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=2545 pid=2575 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6339356132386331386639333766636534323939383161373361373434 Dec 16 12:24:59.224000 audit: BPF prog-id=91 op=LOAD Dec 16 12:24:59.225000 audit: BPF prog-id=92 op=LOAD Dec 16 12:24:59.225000 audit[2610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2565 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.225000 audit: BPF prog-id=92 op=UNLOAD Dec 16 12:24:59.225000 audit[2610]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2565 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.226000 audit: BPF prog-id=93 op=LOAD Dec 16 12:24:59.226000 audit[2610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2565 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.226000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.226000 audit: BPF prog-id=94 op=LOAD Dec 16 12:24:59.226000 audit[2610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=2565 pid=2610 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.226000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.227000 audit: BPF prog-id=94 op=UNLOAD Dec 16 12:24:59.227000 audit[2610]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2565 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.227000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.227000 audit: BPF prog-id=93 op=UNLOAD Dec 16 12:24:59.227000 audit[2610]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2565 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.227000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.227000 audit: BPF prog-id=95 op=LOAD Dec 16 12:24:59.227000 audit[2610]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=2565 pid=2610 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.227000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6666303839373563646165336635653139333534323466373662306631 Dec 16 12:24:59.244699 containerd[1611]: time="2025-12-16T12:24:59.244635128Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4515-1-0-6-95bdd2e3e7,Uid:2f466efe848e856ca7c9bf6e56d3ab93,Namespace:kube-system,Attempt:0,} returns sandbox id \"21747d8025f675998a0a181375c4f89924d0b45b4b62cdf42d2476880db4699a\"" Dec 16 12:24:59.257800 containerd[1611]: time="2025-12-16T12:24:59.257523215Z" level=info msg="CreateContainer within sandbox \"21747d8025f675998a0a181375c4f89924d0b45b4b62cdf42d2476880db4699a\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Dec 16 12:24:59.258423 containerd[1611]: time="2025-12-16T12:24:59.258394551Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4515-1-0-6-95bdd2e3e7,Uid:8680ee0b60674190213cb19d68639c50,Namespace:kube-system,Attempt:0,} returns sandbox id \"c95a28c18f937fce429981a73a7447764cae446962ae4e6e1607435033dcf235\"" Dec 16 12:24:59.266232 containerd[1611]: time="2025-12-16T12:24:59.266186418Z" level=info 
msg="CreateContainer within sandbox \"c95a28c18f937fce429981a73a7447764cae446962ae4e6e1607435033dcf235\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Dec 16 12:24:59.269784 containerd[1611]: time="2025-12-16T12:24:59.269738520Z" level=info msg="Container 6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:24:59.279307 containerd[1611]: time="2025-12-16T12:24:59.278901349Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7,Uid:f1239b6595f69f17e5b85ca50f062d71,Namespace:kube-system,Attempt:0,} returns sandbox id \"ff08975cdae3f5e1935424f76b0f1114521e1da3759d6a5d20316fdc0113cc53\"" Dec 16 12:24:59.279910 containerd[1611]: time="2025-12-16T12:24:59.279614290Z" level=info msg="CreateContainer within sandbox \"21747d8025f675998a0a181375c4f89924d0b45b4b62cdf42d2476880db4699a\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3\"" Dec 16 12:24:59.280905 containerd[1611]: time="2025-12-16T12:24:59.280847096Z" level=info msg="StartContainer for \"6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3\"" Dec 16 12:24:59.286696 containerd[1611]: time="2025-12-16T12:24:59.286636938Z" level=info msg="connecting to shim 6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3" address="unix:///run/containerd/s/6e194d5f3b6751fdfeea2092232d3eeeeb5274f109d22284aba24389259e9d83" protocol=ttrpc version=3 Dec 16 12:24:59.287082 containerd[1611]: time="2025-12-16T12:24:59.287045246Z" level=info msg="Container a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:24:59.289841 containerd[1611]: time="2025-12-16T12:24:59.289796171Z" level=info msg="CreateContainer within sandbox \"ff08975cdae3f5e1935424f76b0f1114521e1da3759d6a5d20316fdc0113cc53\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Dec 16 12:24:59.299782 containerd[1611]: time="2025-12-16T12:24:59.299516305Z" level=info msg="CreateContainer within sandbox \"c95a28c18f937fce429981a73a7447764cae446962ae4e6e1607435033dcf235\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702\"" Dec 16 12:24:59.301817 containerd[1611]: time="2025-12-16T12:24:59.301736324Z" level=info msg="StartContainer for \"a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702\"" Dec 16 12:24:59.304194 containerd[1611]: time="2025-12-16T12:24:59.304144778Z" level=info msg="Container 992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:24:59.305170 containerd[1611]: time="2025-12-16T12:24:59.305007834Z" level=info msg="connecting to shim a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702" address="unix:///run/containerd/s/9347e689c21c6af36ff4bff80b464dbc4689e5ce197d794ae314d09dd4cea5ad" protocol=ttrpc version=3 Dec 16 12:24:59.318670 containerd[1611]: time="2025-12-16T12:24:59.318101076Z" level=info msg="CreateContainer within sandbox \"ff08975cdae3f5e1935424f76b0f1114521e1da3759d6a5d20316fdc0113cc53\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50\"" Dec 16 12:24:59.318898 containerd[1611]: time="2025-12-16T12:24:59.318602862Z" level=info msg="StartContainer 
for \"992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50\"" Dec 16 12:24:59.320600 systemd[1]: Started cri-containerd-6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3.scope - libcontainer container 6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3. Dec 16 12:24:59.322193 containerd[1611]: time="2025-12-16T12:24:59.322141645Z" level=info msg="connecting to shim 992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50" address="unix:///run/containerd/s/acc50c7990515e543e73ffe1b9e37f51dc5628619c3114f970a9a20107b9be26" protocol=ttrpc version=3 Dec 16 12:24:59.342550 systemd[1]: Started cri-containerd-a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702.scope - libcontainer container a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702. Dec 16 12:24:59.345000 audit: BPF prog-id=96 op=LOAD Dec 16 12:24:59.347000 audit: BPF prog-id=97 op=LOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.347000 audit: BPF prog-id=97 op=UNLOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.347000 audit: BPF prog-id=98 op=LOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.347000 audit: BPF prog-id=99 op=LOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.347000 audit: BPF prog-id=99 op=UNLOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.347000 audit: BPF prog-id=98 op=UNLOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.347000 audit: BPF prog-id=100 op=LOAD Dec 16 12:24:59.347000 audit[2660]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=2527 pid=2660 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.347000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3665346336623666383562666438656431346535393337396132666266 Dec 16 12:24:59.359523 systemd[1]: Started cri-containerd-992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50.scope - libcontainer container 992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50. 
Dec 16 12:24:59.370000 audit: BPF prog-id=101 op=LOAD Dec 16 12:24:59.371000 audit: BPF prog-id=102 op=LOAD Dec 16 12:24:59.371000 audit[2676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40000fe180 a2=98 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.371000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.373000 audit: BPF prog-id=102 op=UNLOAD Dec 16 12:24:59.373000 audit[2676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.373000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.375000 audit: BPF prog-id=103 op=LOAD Dec 16 12:24:59.375000 audit[2676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40000fe3e8 a2=98 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.375000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.376000 audit: BPF prog-id=104 op=LOAD Dec 16 12:24:59.376000 audit[2676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40000fe168 a2=98 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.376000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.377000 audit: BPF prog-id=104 op=UNLOAD Dec 16 12:24:59.377000 audit[2676]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.377000 audit: BPF prog-id=103 op=UNLOAD Dec 16 12:24:59.377000 audit[2676]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.377000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.378000 audit: BPF prog-id=105 op=LOAD Dec 16 12:24:59.378000 audit[2676]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40000fe648 a2=98 a3=0 items=0 ppid=2545 pid=2676 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.378000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6131326635353730373433646230326239656664376261626566646363 Dec 16 12:24:59.380000 audit: BPF prog-id=106 op=LOAD Dec 16 12:24:59.381000 audit: BPF prog-id=107 op=LOAD Dec 16 12:24:59.381000 audit[2684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220180 a2=98 a3=0 items=0 ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.381000 audit: BPF prog-id=107 op=UNLOAD Dec 16 12:24:59.381000 audit[2684]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.381000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.382000 audit: BPF prog-id=108 op=LOAD Dec 16 12:24:59.382000 audit[2684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40002203e8 a2=98 a3=0 items=0 ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.382000 audit: BPF prog-id=109 op=LOAD Dec 16 12:24:59.382000 audit[2684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000220168 a2=98 a3=0 items=0 
ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.382000 audit: BPF prog-id=109 op=UNLOAD Dec 16 12:24:59.382000 audit[2684]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.382000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.383000 audit: BPF prog-id=108 op=UNLOAD Dec 16 12:24:59.383000 audit[2684]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.383000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.383000 audit: BPF prog-id=110 op=LOAD Dec 16 12:24:59.383000 audit[2684]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220648 a2=98 a3=0 items=0 ppid=2565 pid=2684 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:24:59.383000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3939323733346363363331646262353134393134343737383135623433 Dec 16 12:24:59.423245 containerd[1611]: time="2025-12-16T12:24:59.423187197Z" level=info msg="StartContainer for \"6e4c6b6f85bfd8ed14e59379a2fbf8d3e04d634c80532faf5a51778264e5acd3\" returns successfully" Dec 16 12:24:59.438755 containerd[1611]: time="2025-12-16T12:24:59.438633494Z" level=info msg="StartContainer for \"992734cc631dbb514914477815b43e33567fd3c04bebc4483f9cf2363f708b50\" returns successfully" Dec 16 12:24:59.449418 containerd[1611]: time="2025-12-16T12:24:59.449362520Z" level=info msg="StartContainer for \"a12f5570743db02b9efd7babefdcc55946361bf7e6ca3d348ab6f3d8e5cc5702\" returns successfully" Dec 16 12:24:59.540641 kubelet[2477]: E1216 12:24:59.540591 2477 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://128.140.49.38:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4515-1-0-6-95bdd2e3e7?timeout=10s\": dial tcp 128.140.49.38:6443: connect: connection refused" interval="1.6s" Dec 16 12:24:59.721459 kubelet[2477]: I1216 12:24:59.720094 
2477 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:00.172765 kubelet[2477]: E1216 12:25:00.172618 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:00.178960 kubelet[2477]: E1216 12:25:00.178872 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:00.183486 kubelet[2477]: E1216 12:25:00.183451 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:01.184535 kubelet[2477]: E1216 12:25:01.184502 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:01.185413 kubelet[2477]: E1216 12:25:01.185346 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.110308 kubelet[2477]: I1216 12:25:02.110049 2477 apiserver.go:52] "Watching apiserver" Dec 16 12:25:02.177880 kubelet[2477]: E1216 12:25:02.177839 2477 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.189752 kubelet[2477]: E1216 12:25:02.189709 2477 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.233462 kubelet[2477]: I1216 12:25:02.233413 2477 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:25:02.254492 kubelet[2477]: I1216 12:25:02.254442 2477 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.333969 kubelet[2477]: I1216 12:25:02.333895 2477 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.345636 kubelet[2477]: E1216 12:25:02.345533 2477 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.345636 kubelet[2477]: I1216 12:25:02.345629 2477 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.351521 kubelet[2477]: E1216 12:25:02.351463 2477 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-6-95bdd2e3e7\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.351521 kubelet[2477]: I1216 12:25:02.351509 2477 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:02.361635 kubelet[2477]: E1216 12:25:02.361490 2477 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" is forbidden: 
no PriorityClass with name system-node-critical was found" pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:03.325054 kubelet[2477]: I1216 12:25:03.325002 2477 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:04.274568 systemd[1]: Reload requested from client PID 2766 ('systemctl') (unit session-7.scope)... Dec 16 12:25:04.274602 systemd[1]: Reloading... Dec 16 12:25:04.395318 zram_generator::config[2819]: No configuration found. Dec 16 12:25:04.644964 systemd[1]: Reloading finished in 369 ms. Dec 16 12:25:04.683383 kubelet[2477]: I1216 12:25:04.682221 2477 dynamic_cafile_content.go:175] "Shutting down controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:25:04.682352 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Dec 16 12:25:04.699852 systemd[1]: kubelet.service: Deactivated successfully. Dec 16 12:25:04.700322 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:25:04.700422 systemd[1]: kubelet.service: Consumed 1.763s CPU time, 121.4M memory peak. Dec 16 12:25:04.703006 kernel: kauditd_printk_skb: 202 callbacks suppressed Dec 16 12:25:04.703100 kernel: audit: type=1131 audit(1765887904.699:389): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:25:04.699000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:25:04.707965 kernel: audit: type=1334 audit(1765887904.704:390): prog-id=111 op=LOAD Dec 16 12:25:04.708080 kernel: audit: type=1334 audit(1765887904.706:391): prog-id=112 op=LOAD Dec 16 12:25:04.708104 kernel: audit: type=1334 audit(1765887904.706:392): prog-id=69 op=UNLOAD Dec 16 12:25:04.708124 kernel: audit: type=1334 audit(1765887904.706:393): prog-id=70 op=UNLOAD Dec 16 12:25:04.704000 audit: BPF prog-id=111 op=LOAD Dec 16 12:25:04.706000 audit: BPF prog-id=112 op=LOAD Dec 16 12:25:04.706000 audit: BPF prog-id=69 op=UNLOAD Dec 16 12:25:04.706000 audit: BPF prog-id=70 op=UNLOAD Dec 16 12:25:04.705573 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Dec 16 12:25:04.712994 kernel: audit: type=1334 audit(1765887904.707:394): prog-id=113 op=LOAD Dec 16 12:25:04.713104 kernel: audit: type=1334 audit(1765887904.707:395): prog-id=78 op=UNLOAD Dec 16 12:25:04.713127 kernel: audit: type=1334 audit(1765887904.707:396): prog-id=114 op=LOAD Dec 16 12:25:04.713146 kernel: audit: type=1334 audit(1765887904.707:397): prog-id=115 op=LOAD Dec 16 12:25:04.713167 kernel: audit: type=1334 audit(1765887904.707:398): prog-id=79 op=UNLOAD Dec 16 12:25:04.707000 audit: BPF prog-id=113 op=LOAD Dec 16 12:25:04.707000 audit: BPF prog-id=78 op=UNLOAD Dec 16 12:25:04.707000 audit: BPF prog-id=114 op=LOAD Dec 16 12:25:04.707000 audit: BPF prog-id=115 op=LOAD Dec 16 12:25:04.707000 audit: BPF prog-id=79 op=UNLOAD Dec 16 12:25:04.707000 audit: BPF prog-id=80 op=UNLOAD Dec 16 12:25:04.708000 audit: BPF prog-id=116 op=LOAD Dec 16 12:25:04.708000 audit: BPF prog-id=61 op=UNLOAD Dec 16 12:25:04.710000 audit: BPF prog-id=117 op=LOAD Dec 16 12:25:04.712000 audit: BPF prog-id=118 op=LOAD Dec 16 12:25:04.712000 audit: BPF prog-id=62 op=UNLOAD Dec 16 12:25:04.712000 audit: BPF prog-id=63 op=UNLOAD Dec 16 12:25:04.712000 audit: BPF prog-id=119 op=LOAD Dec 16 12:25:04.712000 audit: BPF prog-id=75 op=UNLOAD Dec 16 12:25:04.712000 audit: BPF prog-id=120 op=LOAD Dec 16 12:25:04.712000 audit: BPF prog-id=121 op=LOAD Dec 16 12:25:04.712000 audit: BPF prog-id=76 op=UNLOAD Dec 16 12:25:04.712000 audit: BPF prog-id=77 op=UNLOAD Dec 16 12:25:04.720000 audit: BPF prog-id=122 op=LOAD Dec 16 12:25:04.720000 audit: BPF prog-id=65 op=UNLOAD Dec 16 12:25:04.722000 audit: BPF prog-id=123 op=LOAD Dec 16 12:25:04.722000 audit: BPF prog-id=64 op=UNLOAD Dec 16 12:25:04.724000 audit: BPF prog-id=124 op=LOAD Dec 16 12:25:04.724000 audit: BPF prog-id=71 op=UNLOAD Dec 16 12:25:04.724000 audit: BPF prog-id=125 op=LOAD Dec 16 12:25:04.725000 audit: BPF prog-id=126 op=LOAD Dec 16 12:25:04.725000 audit: BPF prog-id=72 op=UNLOAD Dec 16 12:25:04.725000 audit: BPF prog-id=73 op=UNLOAD Dec 16 12:25:04.726000 audit: BPF prog-id=127 op=LOAD Dec 16 12:25:04.726000 audit: BPF prog-id=66 op=UNLOAD Dec 16 12:25:04.726000 audit: BPF prog-id=128 op=LOAD Dec 16 12:25:04.726000 audit: BPF prog-id=129 op=LOAD Dec 16 12:25:04.726000 audit: BPF prog-id=67 op=UNLOAD Dec 16 12:25:04.726000 audit: BPF prog-id=68 op=UNLOAD Dec 16 12:25:04.727000 audit: BPF prog-id=130 op=LOAD Dec 16 12:25:04.727000 audit: BPF prog-id=74 op=UNLOAD Dec 16 12:25:04.870966 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Dec 16 12:25:04.870000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:25:04.884213 (kubelet)[2858]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Dec 16 12:25:04.935053 kubelet[2858]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Dec 16 12:25:04.935053 kubelet[2858]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Dec 16 12:25:04.935053 kubelet[2858]: I1216 12:25:04.934787 2858 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Dec 16 12:25:04.949906 kubelet[2858]: I1216 12:25:04.949533 2858 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Dec 16 12:25:04.950825 kubelet[2858]: I1216 12:25:04.950548 2858 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Dec 16 12:25:04.951204 kubelet[2858]: I1216 12:25:04.951105 2858 watchdog_linux.go:95] "Systemd watchdog is not enabled" Dec 16 12:25:04.951623 kubelet[2858]: I1216 12:25:04.951519 2858 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Dec 16 12:25:04.954345 kubelet[2858]: I1216 12:25:04.952672 2858 server.go:956] "Client rotation is on, will bootstrap in background" Dec 16 12:25:04.955763 kubelet[2858]: I1216 12:25:04.955154 2858 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Dec 16 12:25:04.959988 kubelet[2858]: I1216 12:25:04.959945 2858 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Dec 16 12:25:04.967941 kubelet[2858]: I1216 12:25:04.967909 2858 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Dec 16 12:25:04.973135 kubelet[2858]: I1216 12:25:04.973105 2858 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. Defaulting to /" Dec 16 12:25:04.973539 kubelet[2858]: I1216 12:25:04.973468 2858 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Dec 16 12:25:04.973721 kubelet[2858]: I1216 12:25:04.973533 2858 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4515-1-0-6-95bdd2e3e7","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Dec 16 12:25:04.973721 kubelet[2858]: I1216 12:25:04.973711 2858 topology_manager.go:138] "Creating topology 
manager with none policy" Dec 16 12:25:04.973721 kubelet[2858]: I1216 12:25:04.973725 2858 container_manager_linux.go:306] "Creating device plugin manager" Dec 16 12:25:04.973863 kubelet[2858]: I1216 12:25:04.973750 2858 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Dec 16 12:25:04.974575 kubelet[2858]: I1216 12:25:04.974527 2858 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:25:04.974731 kubelet[2858]: I1216 12:25:04.974716 2858 kubelet.go:475] "Attempting to sync node with API server" Dec 16 12:25:04.974731 kubelet[2858]: I1216 12:25:04.974734 2858 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Dec 16 12:25:04.974903 kubelet[2858]: I1216 12:25:04.974755 2858 kubelet.go:387] "Adding apiserver pod source" Dec 16 12:25:04.974903 kubelet[2858]: I1216 12:25:04.974791 2858 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Dec 16 12:25:04.977305 kubelet[2858]: I1216 12:25:04.976141 2858 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Dec 16 12:25:04.977998 kubelet[2858]: I1216 12:25:04.977976 2858 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Dec 16 12:25:04.978502 kubelet[2858]: I1216 12:25:04.978428 2858 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Dec 16 12:25:04.982924 kubelet[2858]: I1216 12:25:04.982737 2858 server.go:1262] "Started kubelet" Dec 16 12:25:04.984607 kubelet[2858]: I1216 12:25:04.984589 2858 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Dec 16 12:25:04.986881 kubelet[2858]: I1216 12:25:04.986844 2858 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Dec 16 12:25:04.992608 kubelet[2858]: I1216 12:25:04.991474 2858 volume_manager.go:313] "Starting Kubelet Volume Manager" Dec 16 12:25:05.002072 kubelet[2858]: I1216 12:25:04.992808 2858 server.go:310] "Adding debug handlers to kubelet server" Dec 16 12:25:05.006044 kubelet[2858]: I1216 12:25:04.992859 2858 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Dec 16 12:25:05.006242 kubelet[2858]: I1216 12:25:05.006223 2858 server_v1.go:49] "podresources" method="list" useActivePods=true Dec 16 12:25:05.006504 kubelet[2858]: I1216 12:25:05.006486 2858 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Dec 16 12:25:05.006610 kubelet[2858]: I1216 12:25:04.999634 2858 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Dec 16 12:25:05.006753 kubelet[2858]: I1216 12:25:04.994177 2858 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Dec 16 12:25:05.006819 kubelet[2858]: E1216 12:25:04.994414 2858 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4515-1-0-6-95bdd2e3e7\" not found" Dec 16 12:25:05.010937 kubelet[2858]: I1216 12:25:05.009160 2858 factory.go:223] Registration of the systemd container factory successfully Dec 16 12:25:05.011092 kubelet[2858]: I1216 12:25:05.011057 2858 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: 
connect: no such file or directory Dec 16 12:25:05.017339 kubelet[2858]: I1216 12:25:05.010725 2858 reconciler.go:29] "Reconciler: start to sync state" Dec 16 12:25:05.036851 kubelet[2858]: I1216 12:25:05.036801 2858 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Dec 16 12:25:05.039258 kubelet[2858]: I1216 12:25:05.039194 2858 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Dec 16 12:25:05.039258 kubelet[2858]: I1216 12:25:05.039233 2858 status_manager.go:244] "Starting to sync pod status with apiserver" Dec 16 12:25:05.039258 kubelet[2858]: I1216 12:25:05.039260 2858 kubelet.go:2427] "Starting kubelet main sync loop" Dec 16 12:25:05.039494 kubelet[2858]: E1216 12:25:05.039346 2858 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Dec 16 12:25:05.064821 kubelet[2858]: I1216 12:25:05.063777 2858 factory.go:223] Registration of the containerd container factory successfully Dec 16 12:25:05.089627 kubelet[2858]: E1216 12:25:05.089522 2858 kubelet.go:1615] "Image garbage collection failed once. Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Dec 16 12:25:05.139933 kubelet[2858]: E1216 12:25:05.139628 2858 kubelet.go:2451] "Skipping pod synchronization" err="container runtime status check may not have completed yet" Dec 16 12:25:05.141291 kubelet[2858]: I1216 12:25:05.141068 2858 cpu_manager.go:221] "Starting CPU manager" policy="none" Dec 16 12:25:05.142004 kubelet[2858]: I1216 12:25:05.141976 2858 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Dec 16 12:25:05.142073 kubelet[2858]: I1216 12:25:05.142014 2858 state_mem.go:36] "Initialized new in-memory state store" Dec 16 12:25:05.142504 kubelet[2858]: I1216 12:25:05.142464 2858 state_mem.go:88] "Updated default CPUSet" cpuSet="" Dec 16 12:25:05.142504 kubelet[2858]: I1216 12:25:05.142485 2858 state_mem.go:96] "Updated CPUSet assignments" assignments={} Dec 16 12:25:05.142504 kubelet[2858]: I1216 12:25:05.142506 2858 policy_none.go:49] "None policy: Start" Dec 16 12:25:05.144014 kubelet[2858]: I1216 12:25:05.142573 2858 memory_manager.go:187] "Starting memorymanager" policy="None" Dec 16 12:25:05.144014 kubelet[2858]: I1216 12:25:05.142808 2858 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Dec 16 12:25:05.144014 kubelet[2858]: I1216 12:25:05.142961 2858 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Dec 16 12:25:05.144014 kubelet[2858]: I1216 12:25:05.142971 2858 policy_none.go:47] "Start" Dec 16 12:25:05.153558 kubelet[2858]: E1216 12:25:05.153509 2858 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Dec 16 12:25:05.153921 kubelet[2858]: I1216 12:25:05.153901 2858 eviction_manager.go:189] "Eviction manager: starting control loop" Dec 16 12:25:05.153982 kubelet[2858]: I1216 12:25:05.153917 2858 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Dec 16 12:25:05.155580 kubelet[2858]: I1216 12:25:05.154487 2858 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Dec 16 12:25:05.160547 kubelet[2858]: E1216 12:25:05.159012 2858 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Dec 16 12:25:05.271760 kubelet[2858]: I1216 12:25:05.271630 2858 kubelet_node_status.go:75] "Attempting to register node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.288776 kubelet[2858]: I1216 12:25:05.288746 2858 kubelet_node_status.go:124] "Node was previously registered" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.289065 kubelet[2858]: I1216 12:25:05.289034 2858 kubelet_node_status.go:78] "Successfully registered node" node="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.341154 kubelet[2858]: I1216 12:25:05.341105 2858 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.342387 kubelet[2858]: I1216 12:25:05.341576 2858 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.342612 kubelet[2858]: I1216 12:25:05.342594 2858 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.356931 kubelet[2858]: E1216 12:25:05.356898 2858 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" already exists" pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.419689 kubelet[2858]: I1216 12:25:05.419492 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-k8s-certs\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.419689 kubelet[2858]: I1216 12:25:05.419744 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-kubeconfig\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.420179 kubelet[2858]: I1216 12:25:05.419779 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.420179 kubelet[2858]: I1216 12:25:05.420125 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/8680ee0b60674190213cb19d68639c50-kubeconfig\") pod \"kube-scheduler-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"8680ee0b60674190213cb19d68639c50\") " pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.420179 kubelet[2858]: I1216 12:25:05.420154 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/2f466efe848e856ca7c9bf6e56d3ab93-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"2f466efe848e856ca7c9bf6e56d3ab93\") " pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.420520 kubelet[2858]: I1216 12:25:05.420488 2858 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/2f466efe848e856ca7c9bf6e56d3ab93-ca-certs\") pod \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"2f466efe848e856ca7c9bf6e56d3ab93\") " pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.420781 kubelet[2858]: I1216 12:25:05.420746 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/2f466efe848e856ca7c9bf6e56d3ab93-k8s-certs\") pod \"kube-apiserver-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"2f466efe848e856ca7c9bf6e56d3ab93\") " pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.420925 kubelet[2858]: I1216 12:25:05.420877 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-ca-certs\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.421064 kubelet[2858]: I1216 12:25:05.421019 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/f1239b6595f69f17e5b85ca50f062d71-flexvolume-dir\") pod \"kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7\" (UID: \"f1239b6595f69f17e5b85ca50f062d71\") " pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:05.981918 kubelet[2858]: I1216 12:25:05.981859 2858 apiserver.go:52] "Watching apiserver" Dec 16 12:25:06.007173 kubelet[2858]: I1216 12:25:06.007073 2858 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Dec 16 12:25:06.102251 kubelet[2858]: I1216 12:25:06.102157 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4515-1-0-6-95bdd2e3e7" podStartSLOduration=1.1021378450000001 podStartE2EDuration="1.102137845s" podCreationTimestamp="2025-12-16 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:25:06.083687028 +0000 UTC m=+1.193192127" watchObservedRunningTime="2025-12-16 12:25:06.102137845 +0000 UTC m=+1.211642905" Dec 16 12:25:06.117580 kubelet[2858]: I1216 12:25:06.116998 2858 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:06.149352 kubelet[2858]: E1216 12:25:06.149312 2858 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4515-1-0-6-95bdd2e3e7\" already exists" pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:06.149754 kubelet[2858]: I1216 12:25:06.149507 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4515-1-0-6-95bdd2e3e7" podStartSLOduration=1.149492847 podStartE2EDuration="1.149492847s" podCreationTimestamp="2025-12-16 12:25:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:25:06.102469359 +0000 UTC m=+1.211974419" watchObservedRunningTime="2025-12-16 12:25:06.149492847 +0000 UTC m=+1.258997907" Dec 16 12:25:06.188287 kubelet[2858]: I1216 12:25:06.187220 2858 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4515-1-0-6-95bdd2e3e7" podStartSLOduration=3.187202268 podStartE2EDuration="3.187202268s" podCreationTimestamp="2025-12-16 12:25:03 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:25:06.150246113 +0000 UTC m=+1.259751213" watchObservedRunningTime="2025-12-16 12:25:06.187202268 +0000 UTC m=+1.296707368" Dec 16 12:25:10.106221 kubelet[2858]: I1216 12:25:10.105956 2858 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Dec 16 12:25:10.107090 containerd[1611]: time="2025-12-16T12:25:10.106752579Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Dec 16 12:25:10.107560 kubelet[2858]: I1216 12:25:10.107146 2858 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Dec 16 12:25:10.535133 systemd[1]: Created slice kubepods-besteffort-pod58133928_0f11_4244_9616_015e5b44d793.slice - libcontainer container kubepods-besteffort-pod58133928_0f11_4244_9616_015e5b44d793.slice. Dec 16 12:25:10.558562 kubelet[2858]: I1216 12:25:10.558481 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/58133928-0f11-4244-9616-015e5b44d793-xtables-lock\") pod \"kube-proxy-vhhxf\" (UID: \"58133928-0f11-4244-9616-015e5b44d793\") " pod="kube-system/kube-proxy-vhhxf" Dec 16 12:25:10.558562 kubelet[2858]: I1216 12:25:10.558567 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/58133928-0f11-4244-9616-015e5b44d793-kube-proxy\") pod \"kube-proxy-vhhxf\" (UID: \"58133928-0f11-4244-9616-015e5b44d793\") " pod="kube-system/kube-proxy-vhhxf" Dec 16 12:25:10.558805 kubelet[2858]: I1216 12:25:10.558610 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/58133928-0f11-4244-9616-015e5b44d793-lib-modules\") pod \"kube-proxy-vhhxf\" (UID: \"58133928-0f11-4244-9616-015e5b44d793\") " pod="kube-system/kube-proxy-vhhxf" Dec 16 12:25:10.558805 kubelet[2858]: I1216 12:25:10.558646 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6t2j9\" (UniqueName: \"kubernetes.io/projected/58133928-0f11-4244-9616-015e5b44d793-kube-api-access-6t2j9\") pod \"kube-proxy-vhhxf\" (UID: \"58133928-0f11-4244-9616-015e5b44d793\") " pod="kube-system/kube-proxy-vhhxf" Dec 16 12:25:10.668969 kubelet[2858]: E1216 12:25:10.668564 2858 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Dec 16 12:25:10.668969 kubelet[2858]: E1216 12:25:10.668594 2858 projected.go:196] Error preparing data for projected volume kube-api-access-6t2j9 for pod kube-system/kube-proxy-vhhxf: configmap "kube-root-ca.crt" not found Dec 16 12:25:10.669285 kubelet[2858]: E1216 12:25:10.669254 2858 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/58133928-0f11-4244-9616-015e5b44d793-kube-api-access-6t2j9 podName:58133928-0f11-4244-9616-015e5b44d793 nodeName:}" failed. No retries permitted until 2025-12-16 12:25:11.169190053 +0000 UTC m=+6.278695153 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-6t2j9" (UniqueName: "kubernetes.io/projected/58133928-0f11-4244-9616-015e5b44d793-kube-api-access-6t2j9") pod "kube-proxy-vhhxf" (UID: "58133928-0f11-4244-9616-015e5b44d793") : configmap "kube-root-ca.crt" not found Dec 16 12:25:11.261344 systemd[1]: Created slice kubepods-besteffort-podb6628ff0_5804_4c27_923f_6efee43b443e.slice - libcontainer container kubepods-besteffort-podb6628ff0_5804_4c27_923f_6efee43b443e.slice. Dec 16 12:25:11.364404 kubelet[2858]: I1216 12:25:11.364190 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9rdl5\" (UniqueName: \"kubernetes.io/projected/b6628ff0-5804-4c27-923f-6efee43b443e-kube-api-access-9rdl5\") pod \"tigera-operator-65cdcdfd6d-fxp7s\" (UID: \"b6628ff0-5804-4c27-923f-6efee43b443e\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-fxp7s" Dec 16 12:25:11.364729 kubelet[2858]: I1216 12:25:11.364453 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/b6628ff0-5804-4c27-923f-6efee43b443e-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-fxp7s\" (UID: \"b6628ff0-5804-4c27-923f-6efee43b443e\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-fxp7s" Dec 16 12:25:11.452329 containerd[1611]: time="2025-12-16T12:25:11.452167034Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vhhxf,Uid:58133928-0f11-4244-9616-015e5b44d793,Namespace:kube-system,Attempt:0,}" Dec 16 12:25:11.482285 containerd[1611]: time="2025-12-16T12:25:11.482043248Z" level=info msg="connecting to shim 69f3dd09903118ca044c24993256e250f12b3279993fc75f0870a7c88a3e0a4e" address="unix:///run/containerd/s/9b7b7e41cff6ac78fe9272ec9209cf77c13f4b9a1be591247892770f62a26c7f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:11.510633 systemd[1]: Started cri-containerd-69f3dd09903118ca044c24993256e250f12b3279993fc75f0870a7c88a3e0a4e.scope - libcontainer container 69f3dd09903118ca044c24993256e250f12b3279993fc75f0870a7c88a3e0a4e. 
Dec 16 12:25:11.521000 audit: BPF prog-id=131 op=LOAD Dec 16 12:25:11.523479 kernel: kauditd_printk_skb: 32 callbacks suppressed Dec 16 12:25:11.523514 kernel: audit: type=1334 audit(1765887911.521:431): prog-id=131 op=LOAD Dec 16 12:25:11.523000 audit: BPF prog-id=132 op=LOAD Dec 16 12:25:11.528528 kernel: audit: type=1334 audit(1765887911.523:432): prog-id=132 op=LOAD Dec 16 12:25:11.528587 kernel: audit: type=1300 audit(1765887911.523:432): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.523000 audit[2927]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.523000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.531421 kernel: audit: type=1327 audit(1765887911.523:432): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.523000 audit: BPF prog-id=132 op=UNLOAD Dec 16 12:25:11.532767 kernel: audit: type=1334 audit(1765887911.523:433): prog-id=132 op=UNLOAD Dec 16 12:25:11.523000 audit[2927]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.535295 kernel: audit: type=1300 audit(1765887911.523:433): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.523000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.537782 kernel: audit: type=1327 audit(1765887911.523:433): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.523000 audit: BPF prog-id=133 op=LOAD Dec 16 12:25:11.538648 kernel: audit: type=1334 audit(1765887911.523:434): prog-id=133 op=LOAD Dec 16 12:25:11.541324 kernel: audit: type=1300 audit(1765887911.523:434): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 
tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.523000 audit[2927]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.523000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.543703 kernel: audit: type=1327 audit(1765887911.523:434): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.524000 audit: BPF prog-id=134 op=LOAD Dec 16 12:25:11.524000 audit[2927]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.524000 audit: BPF prog-id=134 op=UNLOAD Dec 16 12:25:11.524000 audit[2927]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.524000 audit: BPF prog-id=133 op=UNLOAD Dec 16 12:25:11.524000 audit[2927]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.524000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.524000 audit: BPF prog-id=135 op=LOAD Dec 16 12:25:11.524000 audit[2927]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=2915 pid=2927 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.524000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3639663364643039393033313138636130343463323439393332353665 Dec 16 12:25:11.557193 containerd[1611]: time="2025-12-16T12:25:11.557150416Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-vhhxf,Uid:58133928-0f11-4244-9616-015e5b44d793,Namespace:kube-system,Attempt:0,} returns sandbox id \"69f3dd09903118ca044c24993256e250f12b3279993fc75f0870a7c88a3e0a4e\"" Dec 16 12:25:11.564500 containerd[1611]: time="2025-12-16T12:25:11.564317634Z" level=info msg="CreateContainer within sandbox \"69f3dd09903118ca044c24993256e250f12b3279993fc75f0870a7c88a3e0a4e\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Dec 16 12:25:11.568882 containerd[1611]: time="2025-12-16T12:25:11.568844169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-fxp7s,Uid:b6628ff0-5804-4c27-923f-6efee43b443e,Namespace:tigera-operator,Attempt:0,}" Dec 16 12:25:11.580047 containerd[1611]: time="2025-12-16T12:25:11.580007930Z" level=info msg="Container 64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:11.592309 containerd[1611]: time="2025-12-16T12:25:11.591892481Z" level=info msg="CreateContainer within sandbox \"69f3dd09903118ca044c24993256e250f12b3279993fc75f0870a7c88a3e0a4e\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368\"" Dec 16 12:25:11.592773 containerd[1611]: time="2025-12-16T12:25:11.592724069Z" level=info msg="StartContainer for \"64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368\"" Dec 16 12:25:11.595296 containerd[1611]: time="2025-12-16T12:25:11.594529683Z" level=info msg="connecting to shim 64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368" address="unix:///run/containerd/s/9b7b7e41cff6ac78fe9272ec9209cf77c13f4b9a1be591247892770f62a26c7f" protocol=ttrpc version=3 Dec 16 12:25:11.602483 containerd[1611]: time="2025-12-16T12:25:11.602425890Z" level=info msg="connecting to shim 4790c9286cc762f633cc91ee6f1daae3d4e4deb53599248aaf709f39761df242" address="unix:///run/containerd/s/f4473d77e413f6d0fe866728d02757a2ecff442dabb5c562757ab0c5ac41e682" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:11.616615 systemd[1]: Started cri-containerd-64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368.scope - libcontainer container 64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368. Dec 16 12:25:11.634473 systemd[1]: Started cri-containerd-4790c9286cc762f633cc91ee6f1daae3d4e4deb53599248aaf709f39761df242.scope - libcontainer container 4790c9286cc762f633cc91ee6f1daae3d4e4deb53599248aaf709f39761df242. 
Dec 16 12:25:11.646000 audit: BPF prog-id=136 op=LOAD Dec 16 12:25:11.647000 audit: BPF prog-id=137 op=LOAD Dec 16 12:25:11.647000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220180 a2=98 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.647000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.647000 audit: BPF prog-id=137 op=UNLOAD Dec 16 12:25:11.647000 audit[2986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.647000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.648000 audit: BPF prog-id=138 op=LOAD Dec 16 12:25:11.648000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40002203e8 a2=98 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.648000 audit: BPF prog-id=139 op=LOAD Dec 16 12:25:11.648000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000220168 a2=98 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.648000 audit: BPF prog-id=139 op=UNLOAD Dec 16 12:25:11.648000 audit[2986]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.648000 audit: BPF prog-id=138 op=UNLOAD Dec 16 12:25:11.648000 audit[2986]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.648000 audit: BPF prog-id=140 op=LOAD Dec 16 12:25:11.648000 audit[2986]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220648 a2=98 a3=0 items=0 ppid=2964 pid=2986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3437393063393238366363373632663633336363393165653666316461 Dec 16 12:25:11.657000 audit: BPF prog-id=141 op=LOAD Dec 16 12:25:11.657000 audit[2958]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40002203e8 a2=98 a3=0 items=0 ppid=2915 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.657000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634383439636530643839336464653761366531373533663238343366 Dec 16 12:25:11.657000 audit: BPF prog-id=142 op=LOAD Dec 16 12:25:11.657000 audit[2958]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000220168 a2=98 a3=0 items=0 ppid=2915 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.657000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634383439636530643839336464653761366531373533663238343366 Dec 16 12:25:11.657000 audit: BPF prog-id=142 op=UNLOAD Dec 16 12:25:11.657000 audit[2958]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2915 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.657000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634383439636530643839336464653761366531373533663238343366 Dec 16 12:25:11.657000 audit: BPF prog-id=141 op=UNLOAD Dec 16 12:25:11.657000 audit[2958]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2915 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.657000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634383439636530643839336464653761366531373533663238343366 Dec 16 12:25:11.657000 audit: BPF prog-id=143 op=LOAD Dec 16 12:25:11.657000 audit[2958]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000220648 a2=98 a3=0 items=0 ppid=2915 pid=2958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.657000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634383439636530643839336464653761366531373533663238343366 Dec 16 12:25:11.693721 containerd[1611]: time="2025-12-16T12:25:11.693524311Z" level=info msg="StartContainer for \"64849ce0d893dde7a6e1753f2843f9279cd82141072e9122e8b06c572314a368\" returns successfully" Dec 16 12:25:11.696139 containerd[1611]: time="2025-12-16T12:25:11.696034915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-fxp7s,Uid:b6628ff0-5804-4c27-923f-6efee43b443e,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"4790c9286cc762f633cc91ee6f1daae3d4e4deb53599248aaf709f39761df242\"" Dec 16 12:25:11.698276 containerd[1611]: time="2025-12-16T12:25:11.698166524Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Dec 16 12:25:11.932000 audit[3065]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:11.932000 audit[3065]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcabdaef0 a2=0 a3=1 items=0 ppid=2994 pid=3065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.932000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:25:11.935000 audit[3068]: NETFILTER_CFG table=nat:55 family=2 entries=1 op=nft_register_chain pid=3068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:11.935000 audit[3068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffff5bb1890 a2=0 a3=1 items=0 ppid=2994 pid=3068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.935000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:25:11.935000 audit[3066]: NETFILTER_CFG table=mangle:56 family=10 entries=1 op=nft_register_chain pid=3066 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:11.935000 audit[3066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff7aad0d0 a2=0 a3=1 items=0 ppid=2994 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" 
exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.935000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Dec 16 12:25:11.940000 audit[3071]: NETFILTER_CFG table=nat:57 family=10 entries=1 op=nft_register_chain pid=3071 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:11.940000 audit[3071]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc0cbab60 a2=0 a3=1 items=0 ppid=2994 pid=3071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.940000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Dec 16 12:25:11.943000 audit[3073]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=3073 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:11.943000 audit[3073]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffda5d8020 a2=0 a3=1 items=0 ppid=2994 pid=3073 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.943000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:25:11.944000 audit[3072]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:11.944000 audit[3072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffffd506de0 a2=0 a3=1 items=0 ppid=2994 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:11.944000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Dec 16 12:25:12.037000 audit[3074]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.037000 audit[3074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffd7667050 a2=0 a3=1 items=0 ppid=2994 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.037000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:25:12.041000 audit[3076]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3076 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.041000 audit[3076]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffcdf42d50 a2=0 a3=1 items=0 ppid=2994 pid=3076 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.041000 audit: PROCTITLE 
proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Dec 16 12:25:12.048000 audit[3079]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3079 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.048000 audit[3079]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffc9f4f130 a2=0 a3=1 items=0 ppid=2994 pid=3079 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.048000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:25:12.050000 audit[3080]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.050000 audit[3080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd47bed50 a2=0 a3=1 items=0 ppid=2994 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.050000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:25:12.053000 audit[3082]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3082 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.053000 audit[3082]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe91aa010 a2=0 a3=1 items=0 ppid=2994 pid=3082 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.053000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:25:12.055000 audit[3083]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.055000 audit[3083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd4a1f710 a2=0 a3=1 items=0 ppid=2994 pid=3083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.055000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:25:12.059000 audit[3085]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3085 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.059000 audit[3085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd40fda90 a2=0 a3=1 items=0 ppid=2994 pid=3085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.059000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.065000 audit[3088]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3088 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.065000 audit[3088]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd27b0400 a2=0 a3=1 items=0 ppid=2994 pid=3088 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.065000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.067000 audit[3089]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.067000 audit[3089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd4e2d3a0 a2=0 a3=1 items=0 ppid=2994 pid=3089 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.067000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:25:12.070000 audit[3091]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3091 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.070000 audit[3091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe4be60b0 a2=0 a3=1 items=0 ppid=2994 pid=3091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.070000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:25:12.071000 audit[3092]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3092 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.071000 audit[3092]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffc39e52c0 a2=0 a3=1 items=0 ppid=2994 pid=3092 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.071000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:25:12.075000 audit[3094]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3094 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.075000 audit[3094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff7cf6170 
a2=0 a3=1 items=0 ppid=2994 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.075000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Dec 16 12:25:12.080000 audit[3097]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3097 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.080000 audit[3097]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdf3ab550 a2=0 a3=1 items=0 ppid=2994 pid=3097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.080000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:25:12.086000 audit[3100]: NETFILTER_CFG table=filter:73 family=2 entries=1 op=nft_register_rule pid=3100 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.086000 audit[3100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffffa5d0200 a2=0 a3=1 items=0 ppid=2994 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.086000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:25:12.088000 audit[3101]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3101 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.088000 audit[3101]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffdc1345c0 a2=0 a3=1 items=0 ppid=2994 pid=3101 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.088000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:25:12.091000 audit[3103]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3103 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.091000 audit[3103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffc74b6940 a2=0 a3=1 items=0 ppid=2994 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.091000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 
12:25:12.095000 audit[3106]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3106 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.095000 audit[3106]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffedd4dd60 a2=0 a3=1 items=0 ppid=2994 pid=3106 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.095000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.097000 audit[3107]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3107 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.097000 audit[3107]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5b68540 a2=0 a3=1 items=0 ppid=2994 pid=3107 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.097000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:25:12.102000 audit[3109]: NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3109 subj=system_u:system_r:kernel_t:s0 comm="iptables" Dec 16 12:25:12.102000 audit[3109]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffe9abc090 a2=0 a3=1 items=0 ppid=2994 pid=3109 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.102000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:25:12.129000 audit[3115]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3115 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:12.129000 audit[3115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc1604550 a2=0 a3=1 items=0 ppid=2994 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.129000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:12.140000 audit[3115]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3115 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:12.140000 audit[3115]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffc1604550 a2=0 a3=1 items=0 ppid=2994 pid=3115 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.140000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:12.147000 audit[3120]: NETFILTER_CFG table=filter:81 family=10 
entries=1 op=nft_register_chain pid=3120 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.147000 audit[3120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffc4a776a0 a2=0 a3=1 items=0 ppid=2994 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.147000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Dec 16 12:25:12.153000 audit[3122]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3122 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.153000 audit[3122]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffdca25420 a2=0 a3=1 items=0 ppid=2994 pid=3122 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.153000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Dec 16 12:25:12.161000 audit[3125]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3125 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.161000 audit[3125]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd906ef10 a2=0 a3=1 items=0 ppid=2994 pid=3125 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.161000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Dec 16 12:25:12.163000 audit[3126]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.163000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc4e817a0 a2=0 a3=1 items=0 ppid=2994 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.163000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Dec 16 12:25:12.168000 audit[3128]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.168000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe5195010 a2=0 a3=1 items=0 ppid=2994 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.168000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Dec 16 12:25:12.170000 audit[3129]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3129 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.170000 audit[3129]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc5443ac0 a2=0 a3=1 items=0 ppid=2994 pid=3129 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.170000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Dec 16 12:25:12.173000 audit[3131]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3131 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.173000 audit[3131]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffe2b19ac0 a2=0 a3=1 items=0 ppid=2994 pid=3131 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.173000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.177000 audit[3134]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3134 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.177000 audit[3134]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=fffff31ff5f0 a2=0 a3=1 items=0 ppid=2994 pid=3134 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.177000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.179000 audit[3135]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3135 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.179000 audit[3135]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe2ff58c0 a2=0 a3=1 items=0 ppid=2994 pid=3135 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.179000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Dec 16 12:25:12.182000 audit[3137]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3137 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.182000 audit[3137]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffef683650 a2=0 a3=1 items=0 ppid=2994 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.182000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Dec 16 12:25:12.185000 audit[3138]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3138 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.185000 audit[3138]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffcd0e7660 a2=0 a3=1 items=0 ppid=2994 pid=3138 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.185000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Dec 16 12:25:12.188000 audit[3140]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3140 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.188000 audit[3140]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff26032b0 a2=0 a3=1 items=0 ppid=2994 pid=3140 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.188000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Dec 16 12:25:12.192000 audit[3143]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3143 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.192000 audit[3143]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffe7104fb0 a2=0 a3=1 items=0 ppid=2994 pid=3143 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.192000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Dec 16 12:25:12.197000 audit[3146]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3146 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.197000 audit[3146]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffdebd1830 a2=0 a3=1 items=0 ppid=2994 pid=3146 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.197000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Dec 16 12:25:12.199000 audit[3147]: NETFILTER_CFG table=nat:95 family=10 
entries=1 op=nft_register_chain pid=3147 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.199000 audit[3147]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffffd8c7fd0 a2=0 a3=1 items=0 ppid=2994 pid=3147 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.199000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Dec 16 12:25:12.203000 audit[3149]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3149 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.203000 audit[3149]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffc3da41b0 a2=0 a3=1 items=0 ppid=2994 pid=3149 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.203000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.208000 audit[3152]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule pid=3152 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.208000 audit[3152]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffd8a4f630 a2=0 a3=1 items=0 ppid=2994 pid=3152 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.208000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Dec 16 12:25:12.209000 audit[3153]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3153 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.209000 audit[3153]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe4f6a930 a2=0 a3=1 items=0 ppid=2994 pid=3153 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.209000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Dec 16 12:25:12.212000 audit[3155]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3155 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.212000 audit[3155]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=ffffe14dfe80 a2=0 a3=1 items=0 ppid=2994 pid=3155 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.212000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Dec 16 12:25:12.214000 audit[3156]: NETFILTER_CFG 
table=filter:100 family=10 entries=1 op=nft_register_chain pid=3156 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.214000 audit[3156]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd2bdaab0 a2=0 a3=1 items=0 ppid=2994 pid=3156 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.214000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Dec 16 12:25:12.216000 audit[3158]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3158 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.216000 audit[3158]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=fffffbaee0f0 a2=0 a3=1 items=0 ppid=2994 pid=3158 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.216000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:25:12.221000 audit[3161]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3161 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Dec 16 12:25:12.221000 audit[3161]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffff458d00 a2=0 a3=1 items=0 ppid=2994 pid=3161 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.221000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Dec 16 12:25:12.225000 audit[3163]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:25:12.225000 audit[3163]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=ffffe9daa770 a2=0 a3=1 items=0 ppid=2994 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.225000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:12.225000 audit[3163]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3163 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Dec 16 12:25:12.225000 audit[3163]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=ffffe9daa770 a2=0 a3=1 items=0 ppid=2994 pid=3163 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:12.225000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:12.276368 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3603998320.mount: Deactivated successfully. 
Dec 16 12:25:12.435537 kubelet[2858]: I1216 12:25:12.434860 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-vhhxf" podStartSLOduration=2.434838558 podStartE2EDuration="2.434838558s" podCreationTimestamp="2025-12-16 12:25:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:25:12.152082674 +0000 UTC m=+7.261587774" watchObservedRunningTime="2025-12-16 12:25:12.434838558 +0000 UTC m=+7.544343698" Dec 16 12:25:14.133948 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1814918698.mount: Deactivated successfully. Dec 16 12:25:14.677495 containerd[1611]: time="2025-12-16T12:25:14.677444775Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:14.679866 containerd[1611]: time="2025-12-16T12:25:14.679798986Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Dec 16 12:25:14.681583 containerd[1611]: time="2025-12-16T12:25:14.681510045Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:14.684138 containerd[1611]: time="2025-12-16T12:25:14.684049574Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:14.685236 containerd[1611]: time="2025-12-16T12:25:14.684749086Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 2.986495163s" Dec 16 12:25:14.685236 containerd[1611]: time="2025-12-16T12:25:14.684787325Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Dec 16 12:25:14.689930 containerd[1611]: time="2025-12-16T12:25:14.689889942Z" level=info msg="CreateContainer within sandbox \"4790c9286cc762f633cc91ee6f1daae3d4e4deb53599248aaf709f39761df242\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Dec 16 12:25:14.700548 containerd[1611]: time="2025-12-16T12:25:14.699863740Z" level=info msg="Container 79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:14.714515 containerd[1611]: time="2025-12-16T12:25:14.714437721Z" level=info msg="CreateContainer within sandbox \"4790c9286cc762f633cc91ee6f1daae3d4e4deb53599248aaf709f39761df242\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a\"" Dec 16 12:25:14.716612 containerd[1611]: time="2025-12-16T12:25:14.716558095Z" level=info msg="StartContainer for \"79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a\"" Dec 16 12:25:14.719652 containerd[1611]: time="2025-12-16T12:25:14.719561778Z" level=info msg="connecting to shim 79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a" 
address="unix:///run/containerd/s/f4473d77e413f6d0fe866728d02757a2ecff442dabb5c562757ab0c5ac41e682" protocol=ttrpc version=3 Dec 16 12:25:14.744919 systemd[1]: Started cri-containerd-79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a.scope - libcontainer container 79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a. Dec 16 12:25:14.757000 audit: BPF prog-id=144 op=LOAD Dec 16 12:25:14.758000 audit: BPF prog-id=145 op=LOAD Dec 16 12:25:14.758000 audit[3172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.758000 audit: BPF prog-id=145 op=UNLOAD Dec 16 12:25:14.758000 audit[3172]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.758000 audit: BPF prog-id=146 op=LOAD Dec 16 12:25:14.758000 audit[3172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.758000 audit: BPF prog-id=147 op=LOAD Dec 16 12:25:14.758000 audit[3172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.758000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.758000 audit: BPF prog-id=147 op=UNLOAD Dec 16 12:25:14.758000 audit[3172]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.758000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.759000 audit: BPF prog-id=146 op=UNLOAD Dec 16 12:25:14.759000 audit[3172]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.759000 audit: BPF prog-id=148 op=LOAD Dec 16 12:25:14.759000 audit[3172]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2964 pid=3172 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:14.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739643530643261653362636133613338303231366633323730353537 Dec 16 12:25:14.782942 containerd[1611]: time="2025-12-16T12:25:14.781810093Z" level=info msg="StartContainer for \"79d50d2ae3bca3a380216f32705573c6e226ff7721d293c8a612300c4040a86a\" returns successfully" Dec 16 12:25:15.744825 kubelet[2858]: I1216 12:25:15.744204 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-fxp7s" podStartSLOduration=1.7554349999999999 podStartE2EDuration="4.744182653s" podCreationTimestamp="2025-12-16 12:25:11 +0000 UTC" firstStartedPulling="2025-12-16 12:25:11.697558093 +0000 UTC m=+6.807063193" lastFinishedPulling="2025-12-16 12:25:14.686305786 +0000 UTC m=+9.795810846" observedRunningTime="2025-12-16 12:25:15.16216611 +0000 UTC m=+10.271671250" watchObservedRunningTime="2025-12-16 12:25:15.744182653 +0000 UTC m=+10.853687793" Dec 16 12:25:20.997493 sudo[1925]: pam_unix(sudo:session): session closed for user root Dec 16 12:25:21.000446 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 12:25:21.000622 kernel: audit: type=1106 audit(1765887920.996:511): pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:25:20.996000 audit[1925]: USER_END pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_unix,pam_permit,pam_systemd acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:25:20.996000 audit[1925]: CRED_DISP pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Dec 16 12:25:21.003856 kernel: audit: type=1104 audit(1765887920.996:512): pid=1925 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Dec 16 12:25:21.172277 sshd[1902]: Connection closed by 139.178.89.65 port 38508 Dec 16 12:25:21.172848 sshd-session[1899]: pam_unix(sshd:session): session closed for user core Dec 16 12:25:21.176000 audit[1899]: USER_END pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:25:21.176000 audit[1899]: CRED_DISP pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:25:21.184067 kernel: audit: type=1106 audit(1765887921.176:513): pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:25:21.184143 kernel: audit: type=1104 audit(1765887921.176:514): pid=1899 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:25:21.184687 systemd[1]: sshd@6-128.140.49.38:22-139.178.89.65:38508.service: Deactivated successfully. Dec 16 12:25:21.183000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-128.140.49.38:22-139.178.89.65:38508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:25:21.188587 kernel: audit: type=1131 audit(1765887921.183:515): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-128.140.49.38:22-139.178.89.65:38508 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:25:21.191596 systemd[1]: session-7.scope: Deactivated successfully. Dec 16 12:25:21.192831 systemd[1]: session-7.scope: Consumed 7.131s CPU time, 222.2M memory peak. Dec 16 12:25:21.199427 systemd-logind[1580]: Session 7 logged out. Waiting for processes to exit. Dec 16 12:25:21.204805 systemd-logind[1580]: Removed session 7. 
Dec 16 12:25:25.022000 audit[3252]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3252 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:25.022000 audit[3252]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff6d3d010 a2=0 a3=1 items=0 ppid=2994 pid=3252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:25.028681 kernel: audit: type=1325 audit(1765887925.022:516): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3252 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:25.028784 kernel: audit: type=1300 audit(1765887925.022:516): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=fffff6d3d010 a2=0 a3=1 items=0 ppid=2994 pid=3252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:25.022000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:25.030456 kernel: audit: type=1327 audit(1765887925.022:516): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:25.028000 audit[3252]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3252 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:25.032571 kernel: audit: type=1325 audit(1765887925.028:517): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3252 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:25.028000 audit[3252]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff6d3d010 a2=0 a3=1 items=0 ppid=2994 pid=3252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:25.035734 kernel: audit: type=1300 audit(1765887925.028:517): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffff6d3d010 a2=0 a3=1 items=0 ppid=2994 pid=3252 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:25.028000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:26.048000 audit[3254]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3254 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:26.050629 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:25:26.051101 kernel: audit: type=1325 audit(1765887926.048:518): table=filter:107 family=2 entries=16 op=nft_register_rule pid=3254 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:26.048000 audit[3254]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffe94de3c0 a2=0 a3=1 items=0 ppid=2994 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:26.054526 kernel: audit: type=1300 audit(1765887926.048:518): arch=c00000b7 syscall=211 
success=yes exit=5992 a0=3 a1=ffffe94de3c0 a2=0 a3=1 items=0 ppid=2994 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:26.048000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:26.056329 kernel: audit: type=1327 audit(1765887926.048:518): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:26.055000 audit[3254]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3254 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:26.058061 kernel: audit: type=1325 audit(1765887926.055:519): table=nat:108 family=2 entries=12 op=nft_register_rule pid=3254 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:26.055000 audit[3254]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe94de3c0 a2=0 a3=1 items=0 ppid=2994 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:26.055000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:26.067864 kernel: audit: type=1300 audit(1765887926.055:519): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe94de3c0 a2=0 a3=1 items=0 ppid=2994 pid=3254 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:26.067951 kernel: audit: type=1327 audit(1765887926.055:519): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:29.182000 audit[3258]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3258 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:29.182000 audit[3258]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffdcbbe8b0 a2=0 a3=1 items=0 ppid=2994 pid=3258 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:29.188766 kernel: audit: type=1325 audit(1765887929.182:520): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3258 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:29.188880 kernel: audit: type=1300 audit(1765887929.182:520): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffdcbbe8b0 a2=0 a3=1 items=0 ppid=2994 pid=3258 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:29.182000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:29.192016 kernel: audit: type=1327 audit(1765887929.182:520): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:29.189000 audit[3258]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3258 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Dec 16 12:25:29.193363 kernel: audit: type=1325 audit(1765887929.189:521): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3258 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:29.189000 audit[3258]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffdcbbe8b0 a2=0 a3=1 items=0 ppid=2994 pid=3258 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:29.189000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:30.237000 audit[3260]: NETFILTER_CFG table=filter:111 family=2 entries=18 op=nft_register_rule pid=3260 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:30.237000 audit[3260]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=ffffca7f5dd0 a2=0 a3=1 items=0 ppid=2994 pid=3260 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:30.237000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:30.244000 audit[3260]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3260 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:30.244000 audit[3260]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffca7f5dd0 a2=0 a3=1 items=0 ppid=2994 pid=3260 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:30.244000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:31.272000 audit[3262]: NETFILTER_CFG table=filter:113 family=2 entries=19 op=nft_register_rule pid=3262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:31.275030 kernel: kauditd_printk_skb: 8 callbacks suppressed Dec 16 12:25:31.275140 kernel: audit: type=1325 audit(1765887931.272:524): table=filter:113 family=2 entries=19 op=nft_register_rule pid=3262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:31.275181 kernel: audit: type=1300 audit(1765887931.272:524): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffecd9c590 a2=0 a3=1 items=0 ppid=2994 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:31.272000 audit[3262]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffecd9c590 a2=0 a3=1 items=0 ppid=2994 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:31.272000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:31.278712 kernel: audit: type=1327 audit(1765887931.272:524): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:31.278000 audit[3262]: 
NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:31.278000 audit[3262]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffecd9c590 a2=0 a3=1 items=0 ppid=2994 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:31.286745 kernel: audit: type=1325 audit(1765887931.278:525): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:31.286835 kernel: audit: type=1300 audit(1765887931.278:525): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffecd9c590 a2=0 a3=1 items=0 ppid=2994 pid=3262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:31.286860 kernel: audit: type=1327 audit(1765887931.278:525): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:31.278000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:32.017776 systemd[1]: Created slice kubepods-besteffort-pode3019562_2805_4a19_8df5_5724fe721c5c.slice - libcontainer container kubepods-besteffort-pode3019562_2805_4a19_8df5_5724fe721c5c.slice. Dec 16 12:25:32.095954 kubelet[2858]: I1216 12:25:32.095832 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/e3019562-2805-4a19-8df5-5724fe721c5c-typha-certs\") pod \"calico-typha-fffbc976-rm9p9\" (UID: \"e3019562-2805-4a19-8df5-5724fe721c5c\") " pod="calico-system/calico-typha-fffbc976-rm9p9" Dec 16 12:25:32.095954 kubelet[2858]: I1216 12:25:32.095899 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e3019562-2805-4a19-8df5-5724fe721c5c-tigera-ca-bundle\") pod \"calico-typha-fffbc976-rm9p9\" (UID: \"e3019562-2805-4a19-8df5-5724fe721c5c\") " pod="calico-system/calico-typha-fffbc976-rm9p9" Dec 16 12:25:32.096659 kubelet[2858]: I1216 12:25:32.095931 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-54pzz\" (UniqueName: \"kubernetes.io/projected/e3019562-2805-4a19-8df5-5724fe721c5c-kube-api-access-54pzz\") pod \"calico-typha-fffbc976-rm9p9\" (UID: \"e3019562-2805-4a19-8df5-5724fe721c5c\") " pod="calico-system/calico-typha-fffbc976-rm9p9" Dec 16 12:25:32.166901 systemd[1]: Created slice kubepods-besteffort-podf9b5fcf4_28b0_4fb6_b6fb_fb2ee1997ff9.slice - libcontainer container kubepods-besteffort-podf9b5fcf4_28b0_4fb6_b6fb_fb2ee1997ff9.slice. 
Dec 16 12:25:32.197591 kubelet[2858]: I1216 12:25:32.197528 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-policysync\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197591 kubelet[2858]: I1216 12:25:32.197596 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-var-run-calico\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197808 kubelet[2858]: I1216 12:25:32.197619 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ljk8f\" (UniqueName: \"kubernetes.io/projected/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-kube-api-access-ljk8f\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197808 kubelet[2858]: I1216 12:25:32.197661 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-var-lib-calico\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197808 kubelet[2858]: I1216 12:25:32.197700 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-xtables-lock\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197808 kubelet[2858]: I1216 12:25:32.197724 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-cni-log-dir\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197808 kubelet[2858]: I1216 12:25:32.197743 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-node-certs\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197968 kubelet[2858]: I1216 12:25:32.197775 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-lib-modules\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197968 kubelet[2858]: I1216 12:25:32.197795 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-cni-bin-dir\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197968 kubelet[2858]: I1216 12:25:32.197815 2858 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-cni-net-dir\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197968 kubelet[2858]: I1216 12:25:32.197835 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-flexvol-driver-host\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.197968 kubelet[2858]: I1216 12:25:32.197856 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9-tigera-ca-bundle\") pod \"calico-node-kgg6v\" (UID: \"f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9\") " pod="calico-system/calico-node-kgg6v" Dec 16 12:25:32.295000 audit[3266]: NETFILTER_CFG table=filter:115 family=2 entries=21 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:32.300345 kernel: audit: type=1325 audit(1765887932.295:526): table=filter:115 family=2 entries=21 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:32.295000 audit[3266]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffda10e8f0 a2=0 a3=1 items=0 ppid=2994 pid=3266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.305685 kubelet[2858]: E1216 12:25:32.305341 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.305685 kubelet[2858]: W1216 12:25:32.305641 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.306155 kubelet[2858]: E1216 12:25:32.306088 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.295000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:32.309288 kernel: audit: type=1300 audit(1765887932.295:526): arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffda10e8f0 a2=0 a3=1 items=0 ppid=2994 pid=3266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.309379 kernel: audit: type=1327 audit(1765887932.295:526): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:32.309400 kernel: audit: type=1325 audit(1765887932.304:527): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:32.304000 audit[3266]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3266 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:32.304000 audit[3266]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffda10e8f0 a2=0 a3=1 items=0 ppid=2994 pid=3266 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.304000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:32.311706 kubelet[2858]: E1216 12:25:32.311686 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.311946 kubelet[2858]: W1216 12:25:32.311772 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.311946 kubelet[2858]: E1216 12:25:32.311795 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.326799 containerd[1611]: time="2025-12-16T12:25:32.326746950Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-fffbc976-rm9p9,Uid:e3019562-2805-4a19-8df5-5724fe721c5c,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:32.332044 kubelet[2858]: E1216 12:25:32.331735 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.332044 kubelet[2858]: W1216 12:25:32.331795 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.332044 kubelet[2858]: E1216 12:25:32.331821 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.367675 containerd[1611]: time="2025-12-16T12:25:32.367520148Z" level=info msg="connecting to shim b3ef2b802ebb23166a4c3863f0044767de494a93eafb5dca1299f88e420a939e" address="unix:///run/containerd/s/e3a2fadefce2b379537d7e0ca9a06e12348a2d9fc3b366369e7435f4de4f94b8" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:32.379049 kubelet[2858]: E1216 12:25:32.378114 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:32.379280 kubelet[2858]: E1216 12:25:32.379184 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.379326 kubelet[2858]: W1216 12:25:32.379285 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.379326 kubelet[2858]: E1216 12:25:32.379309 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.380089 kubelet[2858]: E1216 12:25:32.379992 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.380343 kubelet[2858]: W1216 12:25:32.380009 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.380547 kubelet[2858]: E1216 12:25:32.380262 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.381477 kubelet[2858]: E1216 12:25:32.381450 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.381477 kubelet[2858]: W1216 12:25:32.381468 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.383383 kubelet[2858]: E1216 12:25:32.381484 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.383613 kubelet[2858]: E1216 12:25:32.383595 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.383708 kubelet[2858]: W1216 12:25:32.383611 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.383708 kubelet[2858]: E1216 12:25:32.383654 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.383876 kubelet[2858]: E1216 12:25:32.383815 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.383876 kubelet[2858]: W1216 12:25:32.383827 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.383876 kubelet[2858]: E1216 12:25:32.383837 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.384435 kubelet[2858]: E1216 12:25:32.384347 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.384435 kubelet[2858]: W1216 12:25:32.384367 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.384435 kubelet[2858]: E1216 12:25:32.384381 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.384970 kubelet[2858]: E1216 12:25:32.384951 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.384970 kubelet[2858]: W1216 12:25:32.384966 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.384970 kubelet[2858]: E1216 12:25:32.384980 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.385785 kubelet[2858]: E1216 12:25:32.385131 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.385785 kubelet[2858]: W1216 12:25:32.385139 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.385785 kubelet[2858]: E1216 12:25:32.385147 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.385785 kubelet[2858]: E1216 12:25:32.385729 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.385785 kubelet[2858]: W1216 12:25:32.385744 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.385785 kubelet[2858]: E1216 12:25:32.385756 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.385979 kubelet[2858]: E1216 12:25:32.385958 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.385979 kubelet[2858]: W1216 12:25:32.385968 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.385979 kubelet[2858]: E1216 12:25:32.385978 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.386481 kubelet[2858]: E1216 12:25:32.386462 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.386481 kubelet[2858]: W1216 12:25:32.386477 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.386584 kubelet[2858]: E1216 12:25:32.386491 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.387263 kubelet[2858]: E1216 12:25:32.387234 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.387263 kubelet[2858]: W1216 12:25:32.387251 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.387364 kubelet[2858]: E1216 12:25:32.387264 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.388262 kubelet[2858]: E1216 12:25:32.388144 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.388262 kubelet[2858]: W1216 12:25:32.388162 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.388262 kubelet[2858]: E1216 12:25:32.388176 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.389988 kubelet[2858]: E1216 12:25:32.389948 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.389988 kubelet[2858]: W1216 12:25:32.389966 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.389988 kubelet[2858]: E1216 12:25:32.389980 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.390444 kubelet[2858]: E1216 12:25:32.390421 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.390444 kubelet[2858]: W1216 12:25:32.390437 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.390695 kubelet[2858]: E1216 12:25:32.390450 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.391505 kubelet[2858]: E1216 12:25:32.391421 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.391505 kubelet[2858]: W1216 12:25:32.391438 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.391505 kubelet[2858]: E1216 12:25:32.391454 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.391797 kubelet[2858]: E1216 12:25:32.391633 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.391797 kubelet[2858]: W1216 12:25:32.391643 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.391797 kubelet[2858]: E1216 12:25:32.391651 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.391797 kubelet[2858]: E1216 12:25:32.391775 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.391797 kubelet[2858]: W1216 12:25:32.391784 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.391797 kubelet[2858]: E1216 12:25:32.391792 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.392393 kubelet[2858]: E1216 12:25:32.392001 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.392393 kubelet[2858]: W1216 12:25:32.392013 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.392393 kubelet[2858]: E1216 12:25:32.392023 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.393454 kubelet[2858]: E1216 12:25:32.393373 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.393454 kubelet[2858]: W1216 12:25:32.393391 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.393454 kubelet[2858]: E1216 12:25:32.393410 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.400712 kubelet[2858]: E1216 12:25:32.400615 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.401030 kubelet[2858]: W1216 12:25:32.400853 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.401030 kubelet[2858]: E1216 12:25:32.400882 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.401030 kubelet[2858]: I1216 12:25:32.400917 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3de277ef-70c9-4b08-8b83-d92a9680c7b8-kubelet-dir\") pod \"csi-node-driver-psf64\" (UID: \"3de277ef-70c9-4b08-8b83-d92a9680c7b8\") " pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:32.401585 kubelet[2858]: E1216 12:25:32.401569 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.401825 kubelet[2858]: W1216 12:25:32.401687 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.401825 kubelet[2858]: E1216 12:25:32.401709 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.401825 kubelet[2858]: I1216 12:25:32.401734 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/3de277ef-70c9-4b08-8b83-d92a9680c7b8-varrun\") pod \"csi-node-driver-psf64\" (UID: \"3de277ef-70c9-4b08-8b83-d92a9680c7b8\") " pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:32.402012 kubelet[2858]: E1216 12:25:32.401996 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.402290 kubelet[2858]: W1216 12:25:32.402070 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.402290 kubelet[2858]: E1216 12:25:32.402087 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.402290 kubelet[2858]: I1216 12:25:32.402108 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3de277ef-70c9-4b08-8b83-d92a9680c7b8-registration-dir\") pod \"csi-node-driver-psf64\" (UID: \"3de277ef-70c9-4b08-8b83-d92a9680c7b8\") " pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:32.402496 kubelet[2858]: E1216 12:25:32.402482 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.402553 kubelet[2858]: W1216 12:25:32.402542 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.402600 kubelet[2858]: E1216 12:25:32.402591 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.403345 kubelet[2858]: I1216 12:25:32.403192 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3de277ef-70c9-4b08-8b83-d92a9680c7b8-socket-dir\") pod \"csi-node-driver-psf64\" (UID: \"3de277ef-70c9-4b08-8b83-d92a9680c7b8\") " pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:32.403561 kubelet[2858]: E1216 12:25:32.403549 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.403650 kubelet[2858]: W1216 12:25:32.403618 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.403712 kubelet[2858]: E1216 12:25:32.403701 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.403771 kubelet[2858]: I1216 12:25:32.403761 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcp6h\" (UniqueName: \"kubernetes.io/projected/3de277ef-70c9-4b08-8b83-d92a9680c7b8-kube-api-access-fcp6h\") pod \"csi-node-driver-psf64\" (UID: \"3de277ef-70c9-4b08-8b83-d92a9680c7b8\") " pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:32.404041 kubelet[2858]: E1216 12:25:32.404000 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.404200 kubelet[2858]: W1216 12:25:32.404118 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.404200 kubelet[2858]: E1216 12:25:32.404137 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.404459 kubelet[2858]: E1216 12:25:32.404361 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.404459 kubelet[2858]: W1216 12:25:32.404372 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.404459 kubelet[2858]: E1216 12:25:32.404382 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.405287 kubelet[2858]: E1216 12:25:32.404630 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.405402 kubelet[2858]: W1216 12:25:32.405384 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.405490 kubelet[2858]: E1216 12:25:32.405466 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.407750 kubelet[2858]: E1216 12:25:32.407728 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.409577 kubelet[2858]: W1216 12:25:32.408549 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.409577 kubelet[2858]: E1216 12:25:32.408574 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.409577 kubelet[2858]: E1216 12:25:32.409430 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.409577 kubelet[2858]: W1216 12:25:32.409443 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.409577 kubelet[2858]: E1216 12:25:32.409457 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.409957 kubelet[2858]: E1216 12:25:32.409855 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.409957 kubelet[2858]: W1216 12:25:32.409870 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.409957 kubelet[2858]: E1216 12:25:32.409883 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.410133 kubelet[2858]: E1216 12:25:32.410122 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.410190 kubelet[2858]: W1216 12:25:32.410179 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.410264 kubelet[2858]: E1216 12:25:32.410252 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.410538 kubelet[2858]: E1216 12:25:32.410527 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.410712 kubelet[2858]: W1216 12:25:32.410593 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.410712 kubelet[2858]: E1216 12:25:32.410609 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.410943 kubelet[2858]: E1216 12:25:32.410930 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.412031 kubelet[2858]: W1216 12:25:32.412006 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.412231 kubelet[2858]: E1216 12:25:32.412117 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.412474 kubelet[2858]: E1216 12:25:32.412461 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.412553 kubelet[2858]: W1216 12:25:32.412540 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.412682 kubelet[2858]: E1216 12:25:32.412607 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.441542 systemd[1]: Started cri-containerd-b3ef2b802ebb23166a4c3863f0044767de494a93eafb5dca1299f88e420a939e.scope - libcontainer container b3ef2b802ebb23166a4c3863f0044767de494a93eafb5dca1299f88e420a939e. 
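The repeated kubelet errors above all trace back to one FlexVolume probe: the driver binary /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds is not present on the node, so the "init" call produces no output and unmarshalling the empty string fails with "unexpected end of JSON input". A minimal Go sketch of that failure mode (not the kubelet's actual driver-call code; the helper runDriverInit and the driverStatus fields are illustrative only):

package main

import (
	"encoding/json"
	"fmt"
	"os/exec"
)

// driverStatus mirrors the rough shape a FlexVolume driver is expected to
// print as JSON on stdout for the "init" call (field names illustrative).
type driverStatus struct {
	Status  string `json:"status"`
	Message string `json:"message,omitempty"`
}

// runDriverInit execs the driver with "init" and unmarshals its stdout.
// If the binary is missing, exec fails and the output stays empty, so
// json.Unmarshal reports exactly "unexpected end of JSON input".
func runDriverInit(driver string) (*driverStatus, error) {
	out, execErr := exec.Command(driver, "init").Output()
	var st driverStatus
	if err := json.Unmarshal(out, &st); err != nil {
		return nil, fmt.Errorf("failed to unmarshal output %q: %v (exec error: %v)", out, err, execErr)
	}
	return &st, nil
}

func main() {
	// The path reported in the log; on this node the file does not exist.
	_, err := runDriverInit("/opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds")
	fmt.Println(err)
}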
Dec 16 12:25:32.472835 containerd[1611]: time="2025-12-16T12:25:32.472544804Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kgg6v,Uid:f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:32.479000 audit: BPF prog-id=149 op=LOAD Dec 16 12:25:32.481000 audit: BPF prog-id=150 op=LOAD Dec 16 12:25:32.481000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.481000 audit: BPF prog-id=150 op=UNLOAD Dec 16 12:25:32.481000 audit[3318]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.481000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.482000 audit: BPF prog-id=151 op=LOAD Dec 16 12:25:32.482000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.482000 audit: BPF prog-id=152 op=LOAD Dec 16 12:25:32.482000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.482000 audit: BPF prog-id=152 op=UNLOAD Dec 16 12:25:32.482000 audit[3318]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.482000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.482000 audit: BPF prog-id=151 op=UNLOAD Dec 16 12:25:32.482000 audit[3318]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.482000 audit: BPF prog-id=153 op=LOAD Dec 16 12:25:32.482000 audit[3318]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3280 pid=3318 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.482000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6233656632623830326562623233313636613463333836336630303434 Dec 16 12:25:32.506816 kubelet[2858]: E1216 12:25:32.506693 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.506816 kubelet[2858]: W1216 12:25:32.506738 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.506816 kubelet[2858]: E1216 12:25:32.506766 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.507158 kubelet[2858]: E1216 12:25:32.507114 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.507158 kubelet[2858]: W1216 12:25:32.507142 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.507158 kubelet[2858]: E1216 12:25:32.507155 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.508138 kubelet[2858]: E1216 12:25:32.508061 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.508138 kubelet[2858]: W1216 12:25:32.508078 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.508138 kubelet[2858]: E1216 12:25:32.508092 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.509584 kubelet[2858]: E1216 12:25:32.509560 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.509897 kubelet[2858]: W1216 12:25:32.509648 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.509897 kubelet[2858]: E1216 12:25:32.509679 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.510598 kubelet[2858]: E1216 12:25:32.510130 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.510598 kubelet[2858]: W1216 12:25:32.510149 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.510598 kubelet[2858]: E1216 12:25:32.510162 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.510865 kubelet[2858]: E1216 12:25:32.510841 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.510865 kubelet[2858]: W1216 12:25:32.510863 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.510966 kubelet[2858]: E1216 12:25:32.510877 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.511107 kubelet[2858]: E1216 12:25:32.511091 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.511107 kubelet[2858]: W1216 12:25:32.511105 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.511207 kubelet[2858]: E1216 12:25:32.511115 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.511638 containerd[1611]: time="2025-12-16T12:25:32.511586292Z" level=info msg="connecting to shim e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640" address="unix:///run/containerd/s/153f2ddaea1a76f2570321eac10f98f292d61408632c786b28f66d4e6df78e5f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:32.511990 kubelet[2858]: E1216 12:25:32.511782 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.511990 kubelet[2858]: W1216 12:25:32.511805 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.511990 kubelet[2858]: E1216 12:25:32.511818 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.511990 kubelet[2858]: E1216 12:25:32.511984 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.511990 kubelet[2858]: W1216 12:25:32.511993 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.513094 kubelet[2858]: E1216 12:25:32.512002 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.513094 kubelet[2858]: E1216 12:25:32.512926 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.513094 kubelet[2858]: W1216 12:25:32.512942 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.513094 kubelet[2858]: E1216 12:25:32.512955 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.513245 kubelet[2858]: E1216 12:25:32.513173 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.513245 kubelet[2858]: W1216 12:25:32.513182 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.513245 kubelet[2858]: E1216 12:25:32.513197 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.513466 kubelet[2858]: E1216 12:25:32.513356 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.513466 kubelet[2858]: W1216 12:25:32.513389 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.513466 kubelet[2858]: E1216 12:25:32.513399 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.514530 kubelet[2858]: E1216 12:25:32.514467 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.514530 kubelet[2858]: W1216 12:25:32.514530 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.514792 kubelet[2858]: E1216 12:25:32.514548 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.514792 kubelet[2858]: E1216 12:25:32.514731 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.514792 kubelet[2858]: W1216 12:25:32.514745 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.514792 kubelet[2858]: E1216 12:25:32.514755 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.515644 kubelet[2858]: E1216 12:25:32.514894 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.515644 kubelet[2858]: W1216 12:25:32.514903 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.515644 kubelet[2858]: E1216 12:25:32.514910 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.515644 kubelet[2858]: E1216 12:25:32.515050 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.515644 kubelet[2858]: W1216 12:25:32.515059 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.515644 kubelet[2858]: E1216 12:25:32.515067 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.515644 kubelet[2858]: E1216 12:25:32.515502 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.515644 kubelet[2858]: W1216 12:25:32.515519 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.515644 kubelet[2858]: E1216 12:25:32.515535 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.516524 kubelet[2858]: E1216 12:25:32.516448 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.516524 kubelet[2858]: W1216 12:25:32.516473 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.516524 kubelet[2858]: E1216 12:25:32.516489 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.516820 kubelet[2858]: E1216 12:25:32.516713 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.516820 kubelet[2858]: W1216 12:25:32.516724 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.516820 kubelet[2858]: E1216 12:25:32.516735 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.516897 kubelet[2858]: E1216 12:25:32.516884 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.516897 kubelet[2858]: W1216 12:25:32.516893 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.517838 kubelet[2858]: E1216 12:25:32.516901 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.517838 kubelet[2858]: E1216 12:25:32.517379 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.517838 kubelet[2858]: W1216 12:25:32.517393 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.517838 kubelet[2858]: E1216 12:25:32.517405 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.518386 kubelet[2858]: E1216 12:25:32.518348 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.518386 kubelet[2858]: W1216 12:25:32.518368 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.518386 kubelet[2858]: E1216 12:25:32.518382 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.519500 kubelet[2858]: E1216 12:25:32.519476 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.519500 kubelet[2858]: W1216 12:25:32.519493 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.519592 kubelet[2858]: E1216 12:25:32.519507 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.520276 kubelet[2858]: E1216 12:25:32.520239 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.520276 kubelet[2858]: W1216 12:25:32.520257 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.520562 kubelet[2858]: E1216 12:25:32.520409 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.522501 kubelet[2858]: E1216 12:25:32.522470 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.522501 kubelet[2858]: W1216 12:25:32.522495 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.522595 kubelet[2858]: E1216 12:25:32.522511 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:32.553355 kubelet[2858]: E1216 12:25:32.552552 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:32.553355 kubelet[2858]: W1216 12:25:32.552576 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:32.553355 kubelet[2858]: E1216 12:25:32.552596 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:32.556826 systemd[1]: Started cri-containerd-e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640.scope - libcontainer container e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640. Dec 16 12:25:32.585000 audit: BPF prog-id=154 op=LOAD Dec 16 12:25:32.586000 audit: BPF prog-id=155 op=LOAD Dec 16 12:25:32.586000 audit[3401]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.586000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.586000 audit: BPF prog-id=155 op=UNLOAD Dec 16 12:25:32.586000 audit[3401]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.586000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.586000 audit: BPF prog-id=156 op=LOAD Dec 16 12:25:32.586000 audit[3401]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.586000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.587000 audit: BPF prog-id=157 op=LOAD Dec 16 12:25:32.587000 audit[3401]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.587000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.587000 audit: BPF prog-id=157 op=UNLOAD Dec 16 12:25:32.587000 audit[3401]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.587000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.587000 audit: BPF prog-id=156 op=UNLOAD Dec 16 12:25:32.587000 audit[3401]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.587000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.587000 audit: BPF prog-id=158 op=LOAD Dec 16 12:25:32.587000 audit[3401]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3365 pid=3401 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:32.587000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6531636139393262323835386433633461646664323262656439643837 Dec 16 12:25:32.605252 containerd[1611]: time="2025-12-16T12:25:32.604340781Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-fffbc976-rm9p9,Uid:e3019562-2805-4a19-8df5-5724fe721c5c,Namespace:calico-system,Attempt:0,} returns sandbox id \"b3ef2b802ebb23166a4c3863f0044767de494a93eafb5dca1299f88e420a939e\"" Dec 16 12:25:32.608328 containerd[1611]: time="2025-12-16T12:25:32.608235478Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Dec 16 12:25:32.616038 containerd[1611]: time="2025-12-16T12:25:32.615947953Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-kgg6v,Uid:f9b5fcf4-28b0-4fb6-b6fb-fb2ee1997ff9,Namespace:calico-system,Attempt:0,} returns sandbox id \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\"" Dec 16 12:25:34.040452 kubelet[2858]: E1216 12:25:34.039545 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:34.124998 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1329325194.mount: Deactivated successfully. 
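The audit PROCTITLE fields in these records are the audited process's argv, hex-encoded with NUL separators: the runc entries decode to "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/<truncated container id>", and the earlier iptables-restore entry decodes to "iptables-restore -w 5 --noflush --counters". A small Go sketch (illustrative only, not a tool present on this host) that performs the decoding:

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle converts an audit PROCTITLE hex string back into the
// argv it encodes: raw bytes with NUL bytes separating the arguments.
func decodeProctitle(h string) ([]string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return nil, err
	}
	return strings.Split(strings.TrimRight(string(raw), "\x00"), "\x00"), nil
}

func main() {
	// Hex value taken from the iptables-restore audit record above.
	argv, err := decodeProctitle("69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273")
	if err != nil {
		panic(err)
	}
	fmt.Println(argv) // [iptables-restore -w 5 --noflush --counters]
}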
Dec 16 12:25:34.736807 containerd[1611]: time="2025-12-16T12:25:34.736759278Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:34.738455 containerd[1611]: time="2025-12-16T12:25:34.738398429Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Dec 16 12:25:34.739316 containerd[1611]: time="2025-12-16T12:25:34.739147185Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:34.741351 containerd[1611]: time="2025-12-16T12:25:34.741298773Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:34.742179 containerd[1611]: time="2025-12-16T12:25:34.742144208Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.13385589s" Dec 16 12:25:34.742339 containerd[1611]: time="2025-12-16T12:25:34.742320527Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Dec 16 12:25:34.745128 containerd[1611]: time="2025-12-16T12:25:34.745095072Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Dec 16 12:25:34.767628 containerd[1611]: time="2025-12-16T12:25:34.767550746Z" level=info msg="CreateContainer within sandbox \"b3ef2b802ebb23166a4c3863f0044767de494a93eafb5dca1299f88e420a939e\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Dec 16 12:25:34.779936 containerd[1611]: time="2025-12-16T12:25:34.779726478Z" level=info msg="Container a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:34.784634 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3238389598.mount: Deactivated successfully. Dec 16 12:25:34.806433 containerd[1611]: time="2025-12-16T12:25:34.806069171Z" level=info msg="CreateContainer within sandbox \"b3ef2b802ebb23166a4c3863f0044767de494a93eafb5dca1299f88e420a939e\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669\"" Dec 16 12:25:34.807657 containerd[1611]: time="2025-12-16T12:25:34.807593883Z" level=info msg="StartContainer for \"a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669\"" Dec 16 12:25:34.810013 containerd[1611]: time="2025-12-16T12:25:34.809955509Z" level=info msg="connecting to shim a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669" address="unix:///run/containerd/s/e3a2fadefce2b379537d7e0ca9a06e12348a2d9fc3b366369e7435f4de4f94b8" protocol=ttrpc version=3 Dec 16 12:25:34.839658 systemd[1]: Started cri-containerd-a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669.scope - libcontainer container a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669. 
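The "Pulled image" message above reports both a size and a wall-clock pull time, so an effective pull rate can be estimated. A small sketch using the two values copied from that message (treating the reported size as the bytes transferred is an approximation; the nearby "bytes read=31716861" figure is slightly lower):

package main

import (
	"fmt"
	"time"
)

func main() {
	// Values from the calico/typha:v3.30.4 "Pulled image" message above.
	const sizeBytes = 33090541
	dur, err := time.ParseDuration("2.13385589s")
	if err != nil {
		panic(err)
	}
	rate := float64(sizeBytes) / dur.Seconds() / (1 << 20)
	fmt.Printf("~%.1f MiB/s\n", rate) // roughly 14.8 MiB/s
}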
Dec 16 12:25:34.862000 audit: BPF prog-id=159 op=LOAD Dec 16 12:25:34.862000 audit: BPF prog-id=160 op=LOAD Dec 16 12:25:34.862000 audit[3446]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.862000 audit: BPF prog-id=160 op=UNLOAD Dec 16 12:25:34.862000 audit[3446]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.862000 audit: BPF prog-id=161 op=LOAD Dec 16 12:25:34.862000 audit[3446]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.862000 audit: BPF prog-id=162 op=LOAD Dec 16 12:25:34.862000 audit[3446]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.862000 audit: BPF prog-id=162 op=UNLOAD Dec 16 12:25:34.862000 audit[3446]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.862000 audit: BPF prog-id=161 op=UNLOAD Dec 16 12:25:34.862000 audit[3446]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.862000 audit: BPF prog-id=163 op=LOAD Dec 16 12:25:34.862000 audit[3446]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=3280 pid=3446 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:34.862000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6138316433336632636633366363313332663362616665633430393861 Dec 16 12:25:34.901128 containerd[1611]: time="2025-12-16T12:25:34.901026840Z" level=info msg="StartContainer for \"a81d33f2cf36cc132f3bafec4098abf9eb32b7a0ed0265eacc2c926ce35ea669\" returns successfully" Dec 16 12:25:35.213955 kubelet[2858]: E1216 12:25:35.213899 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.213955 kubelet[2858]: W1216 12:25:35.213932 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.213955 kubelet[2858]: E1216 12:25:35.213953 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.215688 kubelet[2858]: E1216 12:25:35.215401 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.215688 kubelet[2858]: W1216 12:25:35.215428 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.215688 kubelet[2858]: E1216 12:25:35.215485 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.215959 kubelet[2858]: E1216 12:25:35.215714 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.215959 kubelet[2858]: W1216 12:25:35.215723 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.215959 kubelet[2858]: E1216 12:25:35.215744 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:35.216112 kubelet[2858]: E1216 12:25:35.216091 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.216112 kubelet[2858]: W1216 12:25:35.216107 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.216173 kubelet[2858]: E1216 12:25:35.216118 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.216318 kubelet[2858]: E1216 12:25:35.216300 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.216318 kubelet[2858]: W1216 12:25:35.216314 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.216368 kubelet[2858]: E1216 12:25:35.216323 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.217958 kubelet[2858]: E1216 12:25:35.217912 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.217958 kubelet[2858]: W1216 12:25:35.217935 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.217958 kubelet[2858]: E1216 12:25:35.217949 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.218105 kubelet[2858]: E1216 12:25:35.218098 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.218129 kubelet[2858]: W1216 12:25:35.218105 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.218129 kubelet[2858]: E1216 12:25:35.218113 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.218829 kubelet[2858]: E1216 12:25:35.218225 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.218829 kubelet[2858]: W1216 12:25:35.218238 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.218829 kubelet[2858]: E1216 12:25:35.218247 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:35.218829 kubelet[2858]: E1216 12:25:35.218469 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.218829 kubelet[2858]: W1216 12:25:35.218477 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.218829 kubelet[2858]: E1216 12:25:35.218486 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.218829 kubelet[2858]: E1216 12:25:35.218663 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.218829 kubelet[2858]: W1216 12:25:35.218672 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.218829 kubelet[2858]: E1216 12:25:35.218681 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.219141 kubelet[2858]: E1216 12:25:35.219114 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.219141 kubelet[2858]: W1216 12:25:35.219136 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.219197 kubelet[2858]: E1216 12:25:35.219147 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.219340 kubelet[2858]: E1216 12:25:35.219322 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.219427 kubelet[2858]: W1216 12:25:35.219345 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.219427 kubelet[2858]: E1216 12:25:35.219355 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.219539 kubelet[2858]: E1216 12:25:35.219520 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.219539 kubelet[2858]: W1216 12:25:35.219528 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.219539 kubelet[2858]: E1216 12:25:35.219536 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:35.220894 kubelet[2858]: E1216 12:25:35.220863 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.220894 kubelet[2858]: W1216 12:25:35.220885 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.220894 kubelet[2858]: E1216 12:25:35.220897 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.221076 kubelet[2858]: E1216 12:25:35.221056 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.221076 kubelet[2858]: W1216 12:25:35.221069 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.221122 kubelet[2858]: E1216 12:25:35.221078 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.231791 kubelet[2858]: E1216 12:25:35.231750 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.231791 kubelet[2858]: W1216 12:25:35.231777 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.231791 kubelet[2858]: E1216 12:25:35.231799 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.232097 kubelet[2858]: E1216 12:25:35.232082 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.232097 kubelet[2858]: W1216 12:25:35.232094 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.232150 kubelet[2858]: E1216 12:25:35.232104 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.232572 kubelet[2858]: E1216 12:25:35.232537 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.232572 kubelet[2858]: W1216 12:25:35.232569 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.232645 kubelet[2858]: E1216 12:25:35.232581 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:35.233515 kubelet[2858]: E1216 12:25:35.233490 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.233515 kubelet[2858]: W1216 12:25:35.233509 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.233515 kubelet[2858]: E1216 12:25:35.233521 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.233776 kubelet[2858]: E1216 12:25:35.233760 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.233776 kubelet[2858]: W1216 12:25:35.233773 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.233832 kubelet[2858]: E1216 12:25:35.233784 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.233987 kubelet[2858]: E1216 12:25:35.233972 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.233987 kubelet[2858]: W1216 12:25:35.233984 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.234052 kubelet[2858]: E1216 12:25:35.233993 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.234189 kubelet[2858]: E1216 12:25:35.234176 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.234189 kubelet[2858]: W1216 12:25:35.234187 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.234246 kubelet[2858]: E1216 12:25:35.234196 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.236518 kubelet[2858]: E1216 12:25:35.236495 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.236518 kubelet[2858]: W1216 12:25:35.236514 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.236611 kubelet[2858]: E1216 12:25:35.236528 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:35.237680 kubelet[2858]: E1216 12:25:35.237656 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.237680 kubelet[2858]: W1216 12:25:35.237673 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.237680 kubelet[2858]: E1216 12:25:35.237685 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.238090 kubelet[2858]: E1216 12:25:35.238068 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.238090 kubelet[2858]: W1216 12:25:35.238083 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.238232 kubelet[2858]: E1216 12:25:35.238095 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.238363 kubelet[2858]: E1216 12:25:35.238346 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.238363 kubelet[2858]: W1216 12:25:35.238361 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.238429 kubelet[2858]: E1216 12:25:35.238371 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.238588 kubelet[2858]: E1216 12:25:35.238571 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.238588 kubelet[2858]: W1216 12:25:35.238584 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.238664 kubelet[2858]: E1216 12:25:35.238594 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.239137 kubelet[2858]: E1216 12:25:35.239113 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.239137 kubelet[2858]: W1216 12:25:35.239130 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.239137 kubelet[2858]: E1216 12:25:35.239141 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:35.239429 kubelet[2858]: E1216 12:25:35.239401 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.239429 kubelet[2858]: W1216 12:25:35.239416 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.239429 kubelet[2858]: E1216 12:25:35.239427 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.239775 kubelet[2858]: E1216 12:25:35.239757 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.239974 kubelet[2858]: W1216 12:25:35.239843 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.239974 kubelet[2858]: E1216 12:25:35.239863 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.240128 kubelet[2858]: E1216 12:25:35.240118 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.240183 kubelet[2858]: W1216 12:25:35.240172 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.241308 kubelet[2858]: E1216 12:25:35.240222 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.241753 kubelet[2858]: E1216 12:25:35.241729 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.241753 kubelet[2858]: W1216 12:25:35.241746 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.241838 kubelet[2858]: E1216 12:25:35.241759 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:35.241981 kubelet[2858]: E1216 12:25:35.241967 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:35.241981 kubelet[2858]: W1216 12:25:35.241978 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:35.242032 kubelet[2858]: E1216 12:25:35.241988 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.040878 kubelet[2858]: E1216 12:25:36.040169 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:36.216494 kubelet[2858]: I1216 12:25:36.216459 2858 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:25:36.226668 kubelet[2858]: E1216 12:25:36.226615 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.226822 kubelet[2858]: W1216 12:25:36.226645 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.226822 kubelet[2858]: E1216 12:25:36.226703 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.227109 kubelet[2858]: E1216 12:25:36.227089 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.227109 kubelet[2858]: W1216 12:25:36.227104 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.227258 kubelet[2858]: E1216 12:25:36.227116 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.227391 kubelet[2858]: E1216 12:25:36.227376 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.227391 kubelet[2858]: W1216 12:25:36.227389 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.227520 kubelet[2858]: E1216 12:25:36.227399 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.227714 kubelet[2858]: E1216 12:25:36.227694 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.227714 kubelet[2858]: W1216 12:25:36.227713 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.227785 kubelet[2858]: E1216 12:25:36.227725 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.227939 kubelet[2858]: E1216 12:25:36.227925 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.227939 kubelet[2858]: W1216 12:25:36.227937 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.227998 kubelet[2858]: E1216 12:25:36.227947 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.228309 kubelet[2858]: E1216 12:25:36.228289 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.228309 kubelet[2858]: W1216 12:25:36.228303 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.228506 kubelet[2858]: E1216 12:25:36.228315 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.228709 kubelet[2858]: E1216 12:25:36.228689 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.228709 kubelet[2858]: W1216 12:25:36.228704 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.228908 kubelet[2858]: E1216 12:25:36.228715 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.229016 kubelet[2858]: E1216 12:25:36.229000 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.229016 kubelet[2858]: W1216 12:25:36.229011 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.229127 kubelet[2858]: E1216 12:25:36.229022 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.229422 kubelet[2858]: E1216 12:25:36.229405 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.229422 kubelet[2858]: W1216 12:25:36.229420 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.229625 kubelet[2858]: E1216 12:25:36.229431 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.229907 kubelet[2858]: E1216 12:25:36.229891 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.229907 kubelet[2858]: W1216 12:25:36.229905 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.230112 kubelet[2858]: E1216 12:25:36.230009 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.230332 kubelet[2858]: E1216 12:25:36.230317 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.230375 kubelet[2858]: W1216 12:25:36.230339 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.230375 kubelet[2858]: E1216 12:25:36.230350 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.230955 kubelet[2858]: E1216 12:25:36.230801 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.230955 kubelet[2858]: W1216 12:25:36.230819 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.230955 kubelet[2858]: E1216 12:25:36.230830 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.231235 kubelet[2858]: E1216 12:25:36.231219 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.231235 kubelet[2858]: W1216 12:25:36.231233 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.231332 kubelet[2858]: E1216 12:25:36.231244 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.231739 kubelet[2858]: E1216 12:25:36.231722 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.231739 kubelet[2858]: W1216 12:25:36.231738 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.231826 kubelet[2858]: E1216 12:25:36.231761 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.231996 kubelet[2858]: E1216 12:25:36.231976 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.232035 kubelet[2858]: W1216 12:25:36.231997 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.232035 kubelet[2858]: E1216 12:25:36.232008 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.241521 kubelet[2858]: E1216 12:25:36.241490 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.241916 kubelet[2858]: W1216 12:25:36.241702 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.241916 kubelet[2858]: E1216 12:25:36.241732 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.242237 kubelet[2858]: E1216 12:25:36.242212 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.242406 kubelet[2858]: W1216 12:25:36.242303 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.242406 kubelet[2858]: E1216 12:25:36.242322 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.242949 kubelet[2858]: E1216 12:25:36.242767 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.242949 kubelet[2858]: W1216 12:25:36.242943 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.243103 kubelet[2858]: E1216 12:25:36.242960 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.243388 kubelet[2858]: E1216 12:25:36.243356 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.243388 kubelet[2858]: W1216 12:25:36.243370 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.243388 kubelet[2858]: E1216 12:25:36.243385 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.243776 kubelet[2858]: E1216 12:25:36.243744 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.243776 kubelet[2858]: W1216 12:25:36.243762 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.243776 kubelet[2858]: E1216 12:25:36.243774 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.244464 kubelet[2858]: E1216 12:25:36.244336 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.244464 kubelet[2858]: W1216 12:25:36.244352 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.244464 kubelet[2858]: E1216 12:25:36.244364 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.244833 kubelet[2858]: E1216 12:25:36.244735 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.244833 kubelet[2858]: W1216 12:25:36.244780 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.244833 kubelet[2858]: E1216 12:25:36.244794 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.245216 kubelet[2858]: E1216 12:25:36.245199 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.245216 kubelet[2858]: W1216 12:25:36.245214 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.245322 kubelet[2858]: E1216 12:25:36.245227 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.246254 kubelet[2858]: E1216 12:25:36.246229 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.246398 kubelet[2858]: W1216 12:25:36.246340 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.246398 kubelet[2858]: E1216 12:25:36.246359 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.247024 kubelet[2858]: E1216 12:25:36.247003 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.247024 kubelet[2858]: W1216 12:25:36.247023 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.247122 kubelet[2858]: E1216 12:25:36.247039 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.247507 kubelet[2858]: E1216 12:25:36.247294 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.247507 kubelet[2858]: W1216 12:25:36.247313 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.247507 kubelet[2858]: E1216 12:25:36.247326 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.247779 kubelet[2858]: E1216 12:25:36.247762 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.247779 kubelet[2858]: W1216 12:25:36.247778 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.247834 kubelet[2858]: E1216 12:25:36.247791 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.248367 kubelet[2858]: E1216 12:25:36.248347 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.248367 kubelet[2858]: W1216 12:25:36.248365 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.248487 kubelet[2858]: E1216 12:25:36.248378 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.248766 kubelet[2858]: E1216 12:25:36.248733 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.248766 kubelet[2858]: W1216 12:25:36.248751 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.248766 kubelet[2858]: E1216 12:25:36.248764 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.249307 kubelet[2858]: E1216 12:25:36.249216 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.249593 kubelet[2858]: W1216 12:25:36.249400 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.249593 kubelet[2858]: E1216 12:25:36.249423 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.250390 kubelet[2858]: E1216 12:25:36.249970 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.250390 kubelet[2858]: W1216 12:25:36.249987 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.250390 kubelet[2858]: E1216 12:25:36.250004 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.250838 kubelet[2858]: E1216 12:25:36.250820 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.251003 kubelet[2858]: W1216 12:25:36.250987 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.251172 kubelet[2858]: E1216 12:25:36.251156 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Dec 16 12:25:36.252656 kubelet[2858]: E1216 12:25:36.252626 2858 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Dec 16 12:25:36.252951 kubelet[2858]: W1216 12:25:36.252919 2858 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Dec 16 12:25:36.253054 kubelet[2858]: E1216 12:25:36.253006 2858 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Dec 16 12:25:36.285162 containerd[1611]: time="2025-12-16T12:25:36.285083714Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:36.286478 containerd[1611]: time="2025-12-16T12:25:36.286283548Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:36.287647 containerd[1611]: time="2025-12-16T12:25:36.287601861Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:36.290496 containerd[1611]: time="2025-12-16T12:25:36.290435406Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:36.291401 containerd[1611]: time="2025-12-16T12:25:36.291093082Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.545876851s" Dec 16 12:25:36.291401 containerd[1611]: time="2025-12-16T12:25:36.291132682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Dec 16 12:25:36.298133 containerd[1611]: time="2025-12-16T12:25:36.298089205Z" level=info msg="CreateContainer within sandbox \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Dec 16 12:25:36.311495 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2014746963.mount: Deactivated successfully. Dec 16 12:25:36.311812 containerd[1611]: time="2025-12-16T12:25:36.311609054Z" level=info msg="Container 67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:36.322909 containerd[1611]: time="2025-12-16T12:25:36.322836755Z" level=info msg="CreateContainer within sandbox \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6\"" Dec 16 12:25:36.326291 containerd[1611]: time="2025-12-16T12:25:36.325458821Z" level=info msg="StartContainer for \"67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6\"" Dec 16 12:25:36.327642 containerd[1611]: time="2025-12-16T12:25:36.327599809Z" level=info msg="connecting to shim 67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6" address="unix:///run/containerd/s/153f2ddaea1a76f2570321eac10f98f292d61408632c786b28f66d4e6df78e5f" protocol=ttrpc version=3 Dec 16 12:25:36.354497 systemd[1]: Started cri-containerd-67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6.scope - libcontainer container 67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6. 
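The repeated kubelet errors above come from FlexVolume plugin probing: the kubelet execs /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds with the argument init and parses its stdout as JSON, but the binary is not installed yet, so the output is empty and unmarshalling fails with "unexpected end of JSON input". The flexvol-driver container created above from the calico/pod2daemon-flexvol image is what typically installs that driver, after which the probe errors stop. As a rough illustration of what the kubelet expects back from the init call, here is a minimal FlexVolume-style driver sketch (hypothetical, not Calico's actual uds binary):

package main

import (
	"encoding/json"
	"fmt"
	"os"
)

// initResponse is the JSON a FlexVolume driver is expected to print for "init".
type initResponse struct {
	Status       string          `json:"status"`
	Capabilities map[string]bool `json:"capabilities,omitempty"`
}

func main() {
	if len(os.Args) > 1 && os.Args[1] == "init" {
		out, err := json.Marshal(initResponse{
			Status:       "Success",
			Capabilities: map[string]bool{"attach": false},
		})
		if err != nil {
			os.Exit(1)
		}
		fmt.Println(string(out))
		return
	}
	// Other FlexVolume calls (mount, unmount, ...) are out of scope for this sketch.
	fmt.Println(`{"status":"Not supported"}`)
}

An empty or non-JSON response from that executable is exactly what produces the "unexpected end of JSON input" lines above.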
Dec 16 12:25:36.419305 kernel: kauditd_printk_skb: 68 callbacks suppressed Dec 16 12:25:36.419382 kernel: audit: type=1334 audit(1765887936.417:552): prog-id=164 op=LOAD Dec 16 12:25:36.417000 audit: BPF prog-id=164 op=LOAD Dec 16 12:25:36.417000 audit[3552]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.417000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.425198 kernel: audit: type=1300 audit(1765887936.417:552): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.425295 kernel: audit: type=1327 audit(1765887936.417:552): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.418000 audit: BPF prog-id=165 op=LOAD Dec 16 12:25:36.418000 audit[3552]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.426345 kernel: audit: type=1334 audit(1765887936.418:553): prog-id=165 op=LOAD Dec 16 12:25:36.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.430786 kernel: audit: type=1300 audit(1765887936.418:553): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.430844 kernel: audit: type=1327 audit(1765887936.418:553): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.418000 audit: BPF prog-id=165 op=UNLOAD Dec 16 12:25:36.418000 audit[3552]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.434983 kernel: audit: type=1334 audit(1765887936.418:554): prog-id=165 op=UNLOAD Dec 16 12:25:36.435013 kernel: audit: type=1300 
audit(1765887936.418:554): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.437614 kernel: audit: type=1327 audit(1765887936.418:554): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.418000 audit: BPF prog-id=164 op=UNLOAD Dec 16 12:25:36.438375 kernel: audit: type=1334 audit(1765887936.418:555): prog-id=164 op=UNLOAD Dec 16 12:25:36.418000 audit[3552]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.418000 audit: BPF prog-id=166 op=LOAD Dec 16 12:25:36.418000 audit[3552]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3365 pid=3552 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:36.418000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3637643066333866636231393730313837646138636464306361666430 Dec 16 12:25:36.457167 containerd[1611]: time="2025-12-16T12:25:36.456397770Z" level=info msg="StartContainer for \"67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6\" returns successfully" Dec 16 12:25:36.467507 systemd[1]: cri-containerd-67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6.scope: Deactivated successfully. Dec 16 12:25:36.470000 audit: BPF prog-id=166 op=UNLOAD Dec 16 12:25:36.472232 containerd[1611]: time="2025-12-16T12:25:36.472124966Z" level=info msg="received container exit event container_id:\"67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6\" id:\"67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6\" pid:3563 exited_at:{seconds:1765887936 nanos:471203491}" Dec 16 12:25:36.496839 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-67d0f38fcb1970187da8cdd0cafd0cde532477ea1f629ae07c61877fcfd37fd6-rootfs.mount: Deactivated successfully. 
Dec 16 12:25:37.224054 containerd[1611]: time="2025-12-16T12:25:37.223931029Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Dec 16 12:25:37.246647 kubelet[2858]: I1216 12:25:37.246457 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-fffbc976-rm9p9" podStartSLOduration=4.109511241 podStartE2EDuration="6.246438833s" podCreationTimestamp="2025-12-16 12:25:31 +0000 UTC" firstStartedPulling="2025-12-16 12:25:32.607681242 +0000 UTC m=+27.717186342" lastFinishedPulling="2025-12-16 12:25:34.744608754 +0000 UTC m=+29.854113934" observedRunningTime="2025-12-16 12:25:35.287406447 +0000 UTC m=+30.396911547" watchObservedRunningTime="2025-12-16 12:25:37.246438833 +0000 UTC m=+32.355943933" Dec 16 12:25:38.041148 kubelet[2858]: E1216 12:25:38.039869 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:40.040329 kubelet[2858]: E1216 12:25:40.040249 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:40.872128 containerd[1611]: time="2025-12-16T12:25:40.872071625Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:40.873263 containerd[1611]: time="2025-12-16T12:25:40.873200620Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Dec 16 12:25:40.875022 containerd[1611]: time="2025-12-16T12:25:40.874073976Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:40.876758 containerd[1611]: time="2025-12-16T12:25:40.876722803Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:40.878361 containerd[1611]: time="2025-12-16T12:25:40.878300596Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.653988369s" Dec 16 12:25:40.878471 containerd[1611]: time="2025-12-16T12:25:40.878364595Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Dec 16 12:25:40.886722 containerd[1611]: time="2025-12-16T12:25:40.886667116Z" level=info msg="CreateContainer within sandbox \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Dec 16 12:25:40.897306 containerd[1611]: time="2025-12-16T12:25:40.895933072Z" level=info msg="Container 
faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:40.908427 containerd[1611]: time="2025-12-16T12:25:40.908361932Z" level=info msg="CreateContainer within sandbox \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d\"" Dec 16 12:25:40.909850 containerd[1611]: time="2025-12-16T12:25:40.909710406Z" level=info msg="StartContainer for \"faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d\"" Dec 16 12:25:40.914420 containerd[1611]: time="2025-12-16T12:25:40.914367224Z" level=info msg="connecting to shim faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d" address="unix:///run/containerd/s/153f2ddaea1a76f2570321eac10f98f292d61408632c786b28f66d4e6df78e5f" protocol=ttrpc version=3 Dec 16 12:25:40.938566 systemd[1]: Started cri-containerd-faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d.scope - libcontainer container faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d. Dec 16 12:25:40.989000 audit: BPF prog-id=167 op=LOAD Dec 16 12:25:40.989000 audit[3613]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3365 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:40.989000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661663833383366393431383463636565613132366666653535323939 Dec 16 12:25:40.990000 audit: BPF prog-id=168 op=LOAD Dec 16 12:25:40.990000 audit[3613]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3365 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:40.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661663833383366393431383463636565613132366666653535323939 Dec 16 12:25:40.990000 audit: BPF prog-id=168 op=UNLOAD Dec 16 12:25:40.990000 audit[3613]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:40.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661663833383366393431383463636565613132366666653535323939 Dec 16 12:25:40.990000 audit: BPF prog-id=167 op=UNLOAD Dec 16 12:25:40.990000 audit[3613]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:40.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661663833383366393431383463636565613132366666653535323939 Dec 16 12:25:40.990000 audit: BPF prog-id=169 op=LOAD Dec 16 12:25:40.990000 audit[3613]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3365 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:40.990000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661663833383366393431383463636565613132366666653535323939 Dec 16 12:25:41.017305 containerd[1611]: time="2025-12-16T12:25:41.017007496Z" level=info msg="StartContainer for \"faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d\" returns successfully" Dec 16 12:25:41.519141 containerd[1611]: time="2025-12-16T12:25:41.519088475Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Dec 16 12:25:41.521871 systemd[1]: cri-containerd-faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d.scope: Deactivated successfully. Dec 16 12:25:41.522298 systemd[1]: cri-containerd-faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d.scope: Consumed 477ms CPU time, 187.2M memory peak, 165.9M written to disk. Dec 16 12:25:41.525000 audit: BPF prog-id=169 op=UNLOAD Dec 16 12:25:41.526741 kernel: kauditd_printk_skb: 21 callbacks suppressed Dec 16 12:25:41.526833 kernel: audit: type=1334 audit(1765887941.525:563): prog-id=169 op=UNLOAD Dec 16 12:25:41.528179 containerd[1611]: time="2025-12-16T12:25:41.528113913Z" level=info msg="received container exit event container_id:\"faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d\" id:\"faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d\" pid:3626 exited_at:{seconds:1765887941 nanos:527833394}" Dec 16 12:25:41.551042 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-faf8383f94184cceea126ffe552992677df3e54583a4afaf4f86e778539fe09d-rootfs.mount: Deactivated successfully. Dec 16 12:25:41.590289 kubelet[2858]: I1216 12:25:41.590244 2858 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Dec 16 12:25:41.655504 systemd[1]: Created slice kubepods-burstable-podb72f115e_557a_48f9_b3b7_382ad8af5dee.slice - libcontainer container kubepods-burstable-podb72f115e_557a_48f9_b3b7_382ad8af5dee.slice. Dec 16 12:25:41.668756 systemd[1]: Created slice kubepods-burstable-pod07aaeefc_07f0_4e68_bf3e_404493256fb6.slice - libcontainer container kubepods-burstable-pod07aaeefc_07f0_4e68_bf3e_404493256fb6.slice. Dec 16 12:25:41.681765 systemd[1]: Created slice kubepods-besteffort-pod6c115690_4c4b_4f0d_9563_71f7671c0428.slice - libcontainer container kubepods-besteffort-pod6c115690_4c4b_4f0d_9563_71f7671c0428.slice. 
Dec 16 12:25:41.692295 systemd[1]: Created slice kubepods-besteffort-pod2053ff95_314d_4312_973b_a00f2cf38258.slice - libcontainer container kubepods-besteffort-pod2053ff95_314d_4312_973b_a00f2cf38258.slice. Dec 16 12:25:41.693705 kubelet[2858]: I1216 12:25:41.693672 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/07aaeefc-07f0-4e68-bf3e-404493256fb6-config-volume\") pod \"coredns-66bc5c9577-9mpwx\" (UID: \"07aaeefc-07f0-4e68-bf3e-404493256fb6\") " pod="kube-system/coredns-66bc5c9577-9mpwx" Dec 16 12:25:41.693705 kubelet[2858]: I1216 12:25:41.693703 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tl96g\" (UniqueName: \"kubernetes.io/projected/1a05c39e-610b-4af0-af9e-5a096e11d6cf-kube-api-access-tl96g\") pod \"whisker-5c8d8449c7-5bxg5\" (UID: \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\") " pod="calico-system/whisker-5c8d8449c7-5bxg5" Dec 16 12:25:41.693805 kubelet[2858]: I1216 12:25:41.693733 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e5eca681-3514-4805-908a-df03e7d148ad-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-5kz8n\" (UID: \"e5eca681-3514-4805-908a-df03e7d148ad\") " pod="calico-system/goldmane-7c778bb748-5kz8n" Dec 16 12:25:41.693805 kubelet[2858]: I1216 12:25:41.693748 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ncn5k\" (UniqueName: \"kubernetes.io/projected/e5eca681-3514-4805-908a-df03e7d148ad-kube-api-access-ncn5k\") pod \"goldmane-7c778bb748-5kz8n\" (UID: \"e5eca681-3514-4805-908a-df03e7d148ad\") " pod="calico-system/goldmane-7c778bb748-5kz8n" Dec 16 12:25:41.693805 kubelet[2858]: I1216 12:25:41.693763 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/b72f115e-557a-48f9-b3b7-382ad8af5dee-config-volume\") pod \"coredns-66bc5c9577-snrff\" (UID: \"b72f115e-557a-48f9-b3b7-382ad8af5dee\") " pod="kube-system/coredns-66bc5c9577-snrff" Dec 16 12:25:41.693805 kubelet[2858]: I1216 12:25:41.693787 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9mwc\" (UniqueName: \"kubernetes.io/projected/6c115690-4c4b-4f0d-9563-71f7671c0428-kube-api-access-z9mwc\") pod \"calico-kube-controllers-6b97fd59-4xf4h\" (UID: \"6c115690-4c4b-4f0d-9563-71f7671c0428\") " pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" Dec 16 12:25:41.693892 kubelet[2858]: I1216 12:25:41.693807 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5eca681-3514-4805-908a-df03e7d148ad-config\") pod \"goldmane-7c778bb748-5kz8n\" (UID: \"e5eca681-3514-4805-908a-df03e7d148ad\") " pod="calico-system/goldmane-7c778bb748-5kz8n" Dec 16 12:25:41.693892 kubelet[2858]: I1216 12:25:41.693826 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-backend-key-pair\") pod \"whisker-5c8d8449c7-5bxg5\" (UID: \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\") " pod="calico-system/whisker-5c8d8449c7-5bxg5" Dec 16 12:25:41.693892 kubelet[2858]: I1216 12:25:41.693843 
2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-httfn\" (UniqueName: \"kubernetes.io/projected/07aaeefc-07f0-4e68-bf3e-404493256fb6-kube-api-access-httfn\") pod \"coredns-66bc5c9577-9mpwx\" (UID: \"07aaeefc-07f0-4e68-bf3e-404493256fb6\") " pod="kube-system/coredns-66bc5c9577-9mpwx" Dec 16 12:25:41.693892 kubelet[2858]: I1216 12:25:41.693857 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tklqv\" (UniqueName: \"kubernetes.io/projected/2053ff95-314d-4312-973b-a00f2cf38258-kube-api-access-tklqv\") pod \"calico-apiserver-79c4b989f5-ctclc\" (UID: \"2053ff95-314d-4312-973b-a00f2cf38258\") " pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" Dec 16 12:25:41.693892 kubelet[2858]: I1216 12:25:41.693873 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6c115690-4c4b-4f0d-9563-71f7671c0428-tigera-ca-bundle\") pod \"calico-kube-controllers-6b97fd59-4xf4h\" (UID: \"6c115690-4c4b-4f0d-9563-71f7671c0428\") " pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" Dec 16 12:25:41.693996 kubelet[2858]: I1216 12:25:41.693887 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-ca-bundle\") pod \"whisker-5c8d8449c7-5bxg5\" (UID: \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\") " pod="calico-system/whisker-5c8d8449c7-5bxg5" Dec 16 12:25:41.693996 kubelet[2858]: I1216 12:25:41.693900 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlv4x\" (UniqueName: \"kubernetes.io/projected/b72f115e-557a-48f9-b3b7-382ad8af5dee-kube-api-access-vlv4x\") pod \"coredns-66bc5c9577-snrff\" (UID: \"b72f115e-557a-48f9-b3b7-382ad8af5dee\") " pod="kube-system/coredns-66bc5c9577-snrff" Dec 16 12:25:41.693996 kubelet[2858]: I1216 12:25:41.693913 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/2053ff95-314d-4312-973b-a00f2cf38258-calico-apiserver-certs\") pod \"calico-apiserver-79c4b989f5-ctclc\" (UID: \"2053ff95-314d-4312-973b-a00f2cf38258\") " pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" Dec 16 12:25:41.693996 kubelet[2858]: I1216 12:25:41.693930 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/e5eca681-3514-4805-908a-df03e7d148ad-goldmane-key-pair\") pod \"goldmane-7c778bb748-5kz8n\" (UID: \"e5eca681-3514-4805-908a-df03e7d148ad\") " pod="calico-system/goldmane-7c778bb748-5kz8n" Dec 16 12:25:41.700804 systemd[1]: Created slice kubepods-besteffort-pode5eca681_3514_4805_908a_df03e7d148ad.slice - libcontainer container kubepods-besteffort-pode5eca681_3514_4805_908a_df03e7d148ad.slice. Dec 16 12:25:41.711900 systemd[1]: Created slice kubepods-besteffort-pod1a05c39e_610b_4af0_af9e_5a096e11d6cf.slice - libcontainer container kubepods-besteffort-pod1a05c39e_610b_4af0_af9e_5a096e11d6cf.slice. Dec 16 12:25:41.727060 systemd[1]: Created slice kubepods-besteffort-pod91bf0287_40fc_4bcc_aa76_a7888b9a94ef.slice - libcontainer container kubepods-besteffort-pod91bf0287_40fc_4bcc_aa76_a7888b9a94ef.slice. 
Dec 16 12:25:41.794497 kubelet[2858]: I1216 12:25:41.794335 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/91bf0287-40fc-4bcc-aa76-a7888b9a94ef-calico-apiserver-certs\") pod \"calico-apiserver-79c4b989f5-tzbch\" (UID: \"91bf0287-40fc-4bcc-aa76-a7888b9a94ef\") " pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" Dec 16 12:25:41.796162 kubelet[2858]: I1216 12:25:41.796120 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8vgp\" (UniqueName: \"kubernetes.io/projected/91bf0287-40fc-4bcc-aa76-a7888b9a94ef-kube-api-access-c8vgp\") pod \"calico-apiserver-79c4b989f5-tzbch\" (UID: \"91bf0287-40fc-4bcc-aa76-a7888b9a94ef\") " pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" Dec 16 12:25:41.967964 containerd[1611]: time="2025-12-16T12:25:41.967532744Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-snrff,Uid:b72f115e-557a-48f9-b3b7-382ad8af5dee,Namespace:kube-system,Attempt:0,}" Dec 16 12:25:41.981637 containerd[1611]: time="2025-12-16T12:25:41.981342439Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9mpwx,Uid:07aaeefc-07f0-4e68-bf3e-404493256fb6,Namespace:kube-system,Attempt:0,}" Dec 16 12:25:41.992323 containerd[1611]: time="2025-12-16T12:25:41.992044350Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b97fd59-4xf4h,Uid:6c115690-4c4b-4f0d-9563-71f7671c0428,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:42.003584 containerd[1611]: time="2025-12-16T12:25:42.003535536Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-ctclc,Uid:2053ff95-314d-4312-973b-a00f2cf38258,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:25:42.009773 containerd[1611]: time="2025-12-16T12:25:42.009039511Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-5kz8n,Uid:e5eca681-3514-4805-908a-df03e7d148ad,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:42.037668 containerd[1611]: time="2025-12-16T12:25:42.037555181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-tzbch,Uid:91bf0287-40fc-4bcc-aa76-a7888b9a94ef,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:25:42.043831 containerd[1611]: time="2025-12-16T12:25:42.042871877Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5c8d8449c7-5bxg5,Uid:1a05c39e-610b-4af0-af9e-5a096e11d6cf,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:42.048553 systemd[1]: Created slice kubepods-besteffort-pod3de277ef_70c9_4b08_8b83_d92a9680c7b8.slice - libcontainer container kubepods-besteffort-pod3de277ef_70c9_4b08_8b83_d92a9680c7b8.slice. 
Dec 16 12:25:42.055878 containerd[1611]: time="2025-12-16T12:25:42.055658939Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psf64,Uid:3de277ef-70c9-4b08-8b83-d92a9680c7b8,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:42.203551 containerd[1611]: time="2025-12-16T12:25:42.203490184Z" level=error msg="Failed to destroy network for sandbox \"6c8e527cdd0d250453c6a3b0d6b7c6dca3765fff37005a121b82ee08c5a31e1f\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.214313 containerd[1611]: time="2025-12-16T12:25:42.213800657Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-snrff,Uid:b72f115e-557a-48f9-b3b7-382ad8af5dee,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c8e527cdd0d250453c6a3b0d6b7c6dca3765fff37005a121b82ee08c5a31e1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.214585 kubelet[2858]: E1216 12:25:42.214062 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c8e527cdd0d250453c6a3b0d6b7c6dca3765fff37005a121b82ee08c5a31e1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.214585 kubelet[2858]: E1216 12:25:42.214124 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c8e527cdd0d250453c6a3b0d6b7c6dca3765fff37005a121b82ee08c5a31e1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-snrff" Dec 16 12:25:42.214585 kubelet[2858]: E1216 12:25:42.214144 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6c8e527cdd0d250453c6a3b0d6b7c6dca3765fff37005a121b82ee08c5a31e1f\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-snrff" Dec 16 12:25:42.215948 kubelet[2858]: E1216 12:25:42.214199 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-snrff_kube-system(b72f115e-557a-48f9-b3b7-382ad8af5dee)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-snrff_kube-system(b72f115e-557a-48f9-b3b7-382ad8af5dee)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6c8e527cdd0d250453c6a3b0d6b7c6dca3765fff37005a121b82ee08c5a31e1f\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-snrff" podUID="b72f115e-557a-48f9-b3b7-382ad8af5dee" Dec 16 12:25:42.230284 containerd[1611]: time="2025-12-16T12:25:42.230220142Z" level=error msg="Failed to destroy network for 
sandbox \"a3d03ff4c23f6ffc80001a5278955f9b04721e04a3f52e30d7a805e24c36b2fa\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.239692 containerd[1611]: time="2025-12-16T12:25:42.239639219Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psf64,Uid:3de277ef-70c9-4b08-8b83-d92a9680c7b8,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d03ff4c23f6ffc80001a5278955f9b04721e04a3f52e30d7a805e24c36b2fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.240087 kubelet[2858]: E1216 12:25:42.240045 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d03ff4c23f6ffc80001a5278955f9b04721e04a3f52e30d7a805e24c36b2fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.240152 kubelet[2858]: E1216 12:25:42.240104 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d03ff4c23f6ffc80001a5278955f9b04721e04a3f52e30d7a805e24c36b2fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:42.240152 kubelet[2858]: E1216 12:25:42.240123 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"a3d03ff4c23f6ffc80001a5278955f9b04721e04a3f52e30d7a805e24c36b2fa\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-psf64" Dec 16 12:25:42.241825 kubelet[2858]: E1216 12:25:42.240647 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"a3d03ff4c23f6ffc80001a5278955f9b04721e04a3f52e30d7a805e24c36b2fa\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:42.250314 containerd[1611]: time="2025-12-16T12:25:42.250207011Z" level=error msg="Failed to destroy network for sandbox \"2848fa6f8ef3494d05d996df049baee2ca12b3a91642a466795500f26314ef52\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.254993 containerd[1611]: time="2025-12-16T12:25:42.254906510Z" level=error 
msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9mpwx,Uid:07aaeefc-07f0-4e68-bf3e-404493256fb6,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2848fa6f8ef3494d05d996df049baee2ca12b3a91642a466795500f26314ef52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.255379 kubelet[2858]: E1216 12:25:42.255082 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2848fa6f8ef3494d05d996df049baee2ca12b3a91642a466795500f26314ef52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.255379 kubelet[2858]: E1216 12:25:42.255128 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2848fa6f8ef3494d05d996df049baee2ca12b3a91642a466795500f26314ef52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-9mpwx" Dec 16 12:25:42.255379 kubelet[2858]: E1216 12:25:42.255146 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2848fa6f8ef3494d05d996df049baee2ca12b3a91642a466795500f26314ef52\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-9mpwx" Dec 16 12:25:42.255537 kubelet[2858]: E1216 12:25:42.255199 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-9mpwx_kube-system(07aaeefc-07f0-4e68-bf3e-404493256fb6)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-9mpwx_kube-system(07aaeefc-07f0-4e68-bf3e-404493256fb6)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2848fa6f8ef3494d05d996df049baee2ca12b3a91642a466795500f26314ef52\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-9mpwx" podUID="07aaeefc-07f0-4e68-bf3e-404493256fb6" Dec 16 12:25:42.255868 containerd[1611]: time="2025-12-16T12:25:42.255751906Z" level=error msg="Failed to destroy network for sandbox \"815abdf11613bf185995e29f724ea639d70513f02b73877796aa0e8e85d7153e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.264057 containerd[1611]: time="2025-12-16T12:25:42.264002468Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Dec 16 12:25:42.266124 containerd[1611]: time="2025-12-16T12:25:42.264561386Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b97fd59-4xf4h,Uid:6c115690-4c4b-4f0d-9563-71f7671c0428,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"815abdf11613bf185995e29f724ea639d70513f02b73877796aa0e8e85d7153e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.268325 kubelet[2858]: E1216 12:25:42.268224 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"815abdf11613bf185995e29f724ea639d70513f02b73877796aa0e8e85d7153e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.268494 kubelet[2858]: E1216 12:25:42.268342 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"815abdf11613bf185995e29f724ea639d70513f02b73877796aa0e8e85d7153e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" Dec 16 12:25:42.268494 kubelet[2858]: E1216 12:25:42.268374 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"815abdf11613bf185995e29f724ea639d70513f02b73877796aa0e8e85d7153e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" Dec 16 12:25:42.268889 kubelet[2858]: E1216 12:25:42.268700 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-6b97fd59-4xf4h_calico-system(6c115690-4c4b-4f0d-9563-71f7671c0428)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-6b97fd59-4xf4h_calico-system(6c115690-4c4b-4f0d-9563-71f7671c0428)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"815abdf11613bf185995e29f724ea639d70513f02b73877796aa0e8e85d7153e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:25:42.290730 containerd[1611]: time="2025-12-16T12:25:42.290638187Z" level=error msg="Failed to destroy network for sandbox \"7274f9136045047dfcbc9f55837ccaf4d29777211eb17c8689c86e1ad030511e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.300187 containerd[1611]: time="2025-12-16T12:25:42.300064464Z" level=error msg="Failed to destroy network for sandbox \"5ca3f81bcfe5089bef7f3c5a7cb2c9fe3baddcfc1fdd16d86281729a12c33a43\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.304438 containerd[1611]: time="2025-12-16T12:25:42.304228405Z" level=error msg="RunPodSandbox for 
&PodSandboxMetadata{Name:whisker-5c8d8449c7-5bxg5,Uid:1a05c39e-610b-4af0-af9e-5a096e11d6cf,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7274f9136045047dfcbc9f55837ccaf4d29777211eb17c8689c86e1ad030511e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.305545 kubelet[2858]: E1216 12:25:42.305200 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7274f9136045047dfcbc9f55837ccaf4d29777211eb17c8689c86e1ad030511e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.305545 kubelet[2858]: E1216 12:25:42.305255 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7274f9136045047dfcbc9f55837ccaf4d29777211eb17c8689c86e1ad030511e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c8d8449c7-5bxg5" Dec 16 12:25:42.305545 kubelet[2858]: E1216 12:25:42.305291 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7274f9136045047dfcbc9f55837ccaf4d29777211eb17c8689c86e1ad030511e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-5c8d8449c7-5bxg5" Dec 16 12:25:42.305758 kubelet[2858]: E1216 12:25:42.305726 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-5c8d8449c7-5bxg5_calico-system(1a05c39e-610b-4af0-af9e-5a096e11d6cf)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-5c8d8449c7-5bxg5_calico-system(1a05c39e-610b-4af0-af9e-5a096e11d6cf)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7274f9136045047dfcbc9f55837ccaf4d29777211eb17c8689c86e1ad030511e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-5c8d8449c7-5bxg5" podUID="1a05c39e-610b-4af0-af9e-5a096e11d6cf" Dec 16 12:25:42.306684 containerd[1611]: time="2025-12-16T12:25:42.306564194Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-5kz8n,Uid:e5eca681-3514-4805-908a-df03e7d148ad,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ca3f81bcfe5089bef7f3c5a7cb2c9fe3baddcfc1fdd16d86281729a12c33a43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.307345 kubelet[2858]: E1216 12:25:42.307062 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ca3f81bcfe5089bef7f3c5a7cb2c9fe3baddcfc1fdd16d86281729a12c33a43\": plugin 
type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.307495 kubelet[2858]: E1216 12:25:42.307459 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ca3f81bcfe5089bef7f3c5a7cb2c9fe3baddcfc1fdd16d86281729a12c33a43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-5kz8n" Dec 16 12:25:42.307652 kubelet[2858]: E1216 12:25:42.307505 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"5ca3f81bcfe5089bef7f3c5a7cb2c9fe3baddcfc1fdd16d86281729a12c33a43\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-5kz8n" Dec 16 12:25:42.307987 kubelet[2858]: E1216 12:25:42.307694 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-5kz8n_calico-system(e5eca681-3514-4805-908a-df03e7d148ad)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-5kz8n_calico-system(e5eca681-3514-4805-908a-df03e7d148ad)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"5ca3f81bcfe5089bef7f3c5a7cb2c9fe3baddcfc1fdd16d86281729a12c33a43\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:25:42.316090 containerd[1611]: time="2025-12-16T12:25:42.316042631Z" level=error msg="Failed to destroy network for sandbox \"7f5325f5eab54c4b4296449c6480d647940ee7e6272a012ebbcf3076b3d96aae\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.319395 containerd[1611]: time="2025-12-16T12:25:42.319217536Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-ctclc,Uid:2053ff95-314d-4312-973b-a00f2cf38258,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f5325f5eab54c4b4296449c6480d647940ee7e6272a012ebbcf3076b3d96aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.319995 kubelet[2858]: E1216 12:25:42.319946 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f5325f5eab54c4b4296449c6480d647940ee7e6272a012ebbcf3076b3d96aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.320353 kubelet[2858]: E1216 12:25:42.320203 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = 
failed to setup network for sandbox \"7f5325f5eab54c4b4296449c6480d647940ee7e6272a012ebbcf3076b3d96aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" Dec 16 12:25:42.320353 kubelet[2858]: E1216 12:25:42.320239 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"7f5325f5eab54c4b4296449c6480d647940ee7e6272a012ebbcf3076b3d96aae\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" Dec 16 12:25:42.320353 kubelet[2858]: E1216 12:25:42.320310 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-79c4b989f5-ctclc_calico-apiserver(2053ff95-314d-4312-973b-a00f2cf38258)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-79c4b989f5-ctclc_calico-apiserver(2053ff95-314d-4312-973b-a00f2cf38258)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"7f5325f5eab54c4b4296449c6480d647940ee7e6272a012ebbcf3076b3d96aae\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:25:42.320543 containerd[1611]: time="2025-12-16T12:25:42.320223812Z" level=error msg="Failed to destroy network for sandbox \"2405e65dd642cfa75ebbedb0d5e99e63f43d7d1b332b0efe8c18ce5d197de9df\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.322977 containerd[1611]: time="2025-12-16T12:25:42.322910360Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-tzbch,Uid:91bf0287-40fc-4bcc-aa76-a7888b9a94ef,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"2405e65dd642cfa75ebbedb0d5e99e63f43d7d1b332b0efe8c18ce5d197de9df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.323340 kubelet[2858]: E1216 12:25:42.323309 2858 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2405e65dd642cfa75ebbedb0d5e99e63f43d7d1b332b0efe8c18ce5d197de9df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Dec 16 12:25:42.323991 kubelet[2858]: E1216 12:25:42.323963 2858 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2405e65dd642cfa75ebbedb0d5e99e63f43d7d1b332b0efe8c18ce5d197de9df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" Dec 16 12:25:42.324303 kubelet[2858]: E1216 12:25:42.324077 2858 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"2405e65dd642cfa75ebbedb0d5e99e63f43d7d1b332b0efe8c18ce5d197de9df\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" Dec 16 12:25:42.324303 kubelet[2858]: E1216 12:25:42.324139 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-79c4b989f5-tzbch_calico-apiserver(91bf0287-40fc-4bcc-aa76-a7888b9a94ef)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-79c4b989f5-tzbch_calico-apiserver(91bf0287-40fc-4bcc-aa76-a7888b9a94ef)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"2405e65dd642cfa75ebbedb0d5e99e63f43d7d1b332b0efe8c18ce5d197de9df\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:25:42.356709 kubelet[2858]: I1216 12:25:42.356638 2858 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Dec 16 12:25:42.410000 audit[3881]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3881 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:42.410000 audit[3881]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe03e61c0 a2=0 a3=1 items=0 ppid=2994 pid=3881 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:42.414565 kernel: audit: type=1325 audit(1765887942.410:564): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3881 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:42.414758 kernel: audit: type=1300 audit(1765887942.410:564): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe03e61c0 a2=0 a3=1 items=0 ppid=2994 pid=3881 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:42.414790 kernel: audit: type=1327 audit(1765887942.410:564): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:42.410000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:42.422000 audit[3881]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3881 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:42.422000 audit[3881]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffe03e61c0 a2=0 a3=1 items=0 ppid=2994 pid=3881 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:42.428533 kernel: audit: type=1325 
audit(1765887942.422:565): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3881 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:42.428618 kernel: audit: type=1300 audit(1765887942.422:565): arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffe03e61c0 a2=0 a3=1 items=0 ppid=2994 pid=3881 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:42.428643 kernel: audit: type=1327 audit(1765887942.422:565): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:42.422000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:42.898678 systemd[1]: run-netns-cni\x2db08d05df\x2db692\x2dbc09\x2d2579\x2d8d6ca5af1f2e.mount: Deactivated successfully. Dec 16 12:25:42.898803 systemd[1]: run-netns-cni\x2d90417880\x2da069\x2d853f\x2d704e\x2de3461c56120f.mount: Deactivated successfully. Dec 16 12:25:48.159715 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount4049900570.mount: Deactivated successfully. Dec 16 12:25:48.197046 containerd[1611]: time="2025-12-16T12:25:48.195910055Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:48.197046 containerd[1611]: time="2025-12-16T12:25:48.196974811Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Dec 16 12:25:48.197836 containerd[1611]: time="2025-12-16T12:25:48.197689328Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:48.200045 containerd[1611]: time="2025-12-16T12:25:48.199977359Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Dec 16 12:25:48.200946 containerd[1611]: time="2025-12-16T12:25:48.200910115Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 5.934597937s" Dec 16 12:25:48.200946 containerd[1611]: time="2025-12-16T12:25:48.200941595Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Dec 16 12:25:48.224092 containerd[1611]: time="2025-12-16T12:25:48.224039621Z" level=info msg="CreateContainer within sandbox \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Dec 16 12:25:48.233480 containerd[1611]: time="2025-12-16T12:25:48.233435142Z" level=info msg="Container 6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:48.239143 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount1228749649.mount: Deactivated successfully. 
Dec 16 12:25:48.249317 containerd[1611]: time="2025-12-16T12:25:48.249185958Z" level=info msg="CreateContainer within sandbox \"e1ca992b2858d3c4adfd22bed9d871162419badbfc1721630d137d8e97fb8640\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e\"" Dec 16 12:25:48.250642 containerd[1611]: time="2025-12-16T12:25:48.250514713Z" level=info msg="StartContainer for \"6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e\"" Dec 16 12:25:48.255403 containerd[1611]: time="2025-12-16T12:25:48.255216734Z" level=info msg="connecting to shim 6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e" address="unix:///run/containerd/s/153f2ddaea1a76f2570321eac10f98f292d61408632c786b28f66d4e6df78e5f" protocol=ttrpc version=3 Dec 16 12:25:48.324578 systemd[1]: Started cri-containerd-6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e.scope - libcontainer container 6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e. Dec 16 12:25:48.394000 audit: BPF prog-id=170 op=LOAD Dec 16 12:25:48.394000 audit[3888]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.399470 kernel: audit: type=1334 audit(1765887948.394:566): prog-id=170 op=LOAD Dec 16 12:25:48.399592 kernel: audit: type=1300 audit(1765887948.394:566): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.399626 kernel: audit: type=1327 audit(1765887948.394:566): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.394000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.404355 kernel: audit: type=1334 audit(1765887948.394:567): prog-id=171 op=LOAD Dec 16 12:25:48.394000 audit: BPF prog-id=171 op=LOAD Dec 16 12:25:48.394000 audit[3888]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.407970 kernel: audit: type=1300 audit(1765887948.394:567): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.408675 kernel: audit: type=1327 audit(1765887948.394:567): 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.394000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.411640 kernel: audit: type=1334 audit(1765887948.394:568): prog-id=171 op=UNLOAD Dec 16 12:25:48.415848 kernel: audit: type=1300 audit(1765887948.394:568): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.416027 kernel: audit: type=1327 audit(1765887948.394:568): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.394000 audit: BPF prog-id=171 op=UNLOAD Dec 16 12:25:48.394000 audit[3888]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.394000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.394000 audit: BPF prog-id=170 op=UNLOAD Dec 16 12:25:48.417658 kernel: audit: type=1334 audit(1765887948.394:569): prog-id=170 op=UNLOAD Dec 16 12:25:48.394000 audit[3888]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.394000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.394000 audit: BPF prog-id=172 op=LOAD Dec 16 12:25:48.394000 audit[3888]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=3365 pid=3888 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:48.394000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3634373261323336326462363730646265643035656334383631636665 Dec 16 12:25:48.448535 containerd[1611]: 
time="2025-12-16T12:25:48.448488266Z" level=info msg="StartContainer for \"6472a2362db670dbed05ec4861cfeb1a182b0dbc304086b36e138da38377188e\" returns successfully" Dec 16 12:25:48.609330 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Dec 16 12:25:48.609493 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Dec 16 12:25:48.846288 kubelet[2858]: I1216 12:25:48.846210 2858 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tl96g\" (UniqueName: \"kubernetes.io/projected/1a05c39e-610b-4af0-af9e-5a096e11d6cf-kube-api-access-tl96g\") pod \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\" (UID: \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\") " Dec 16 12:25:48.847193 kubelet[2858]: I1216 12:25:48.847047 2858 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-ca-bundle\") pod \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\" (UID: \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\") " Dec 16 12:25:48.847193 kubelet[2858]: I1216 12:25:48.847136 2858 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-backend-key-pair\") pod \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\" (UID: \"1a05c39e-610b-4af0-af9e-5a096e11d6cf\") " Dec 16 12:25:48.849779 kubelet[2858]: I1216 12:25:48.849735 2858 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "1a05c39e-610b-4af0-af9e-5a096e11d6cf" (UID: "1a05c39e-610b-4af0-af9e-5a096e11d6cf"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Dec 16 12:25:48.854722 kubelet[2858]: I1216 12:25:48.854452 2858 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "1a05c39e-610b-4af0-af9e-5a096e11d6cf" (UID: "1a05c39e-610b-4af0-af9e-5a096e11d6cf"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Dec 16 12:25:48.858194 kubelet[2858]: I1216 12:25:48.857628 2858 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1a05c39e-610b-4af0-af9e-5a096e11d6cf-kube-api-access-tl96g" (OuterVolumeSpecName: "kube-api-access-tl96g") pod "1a05c39e-610b-4af0-af9e-5a096e11d6cf" (UID: "1a05c39e-610b-4af0-af9e-5a096e11d6cf"). InnerVolumeSpecName "kube-api-access-tl96g". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Dec 16 12:25:48.948351 kubelet[2858]: I1216 12:25:48.948258 2858 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tl96g\" (UniqueName: \"kubernetes.io/projected/1a05c39e-610b-4af0-af9e-5a096e11d6cf-kube-api-access-tl96g\") on node \"ci-4515-1-0-6-95bdd2e3e7\" DevicePath \"\"" Dec 16 12:25:48.948351 kubelet[2858]: I1216 12:25:48.948314 2858 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-ca-bundle\") on node \"ci-4515-1-0-6-95bdd2e3e7\" DevicePath \"\"" Dec 16 12:25:48.948351 kubelet[2858]: I1216 12:25:48.948327 2858 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/1a05c39e-610b-4af0-af9e-5a096e11d6cf-whisker-backend-key-pair\") on node \"ci-4515-1-0-6-95bdd2e3e7\" DevicePath \"\"" Dec 16 12:25:49.050700 systemd[1]: Removed slice kubepods-besteffort-pod1a05c39e_610b_4af0_af9e_5a096e11d6cf.slice - libcontainer container kubepods-besteffort-pod1a05c39e_610b_4af0_af9e_5a096e11d6cf.slice. Dec 16 12:25:49.161022 systemd[1]: var-lib-kubelet-pods-1a05c39e\x2d610b\x2d4af0\x2daf9e\x2d5a096e11d6cf-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2dtl96g.mount: Deactivated successfully. Dec 16 12:25:49.162413 systemd[1]: var-lib-kubelet-pods-1a05c39e\x2d610b\x2d4af0\x2daf9e\x2d5a096e11d6cf-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. Dec 16 12:25:49.398366 kubelet[2858]: I1216 12:25:49.398299 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-kgg6v" podStartSLOduration=1.813827533 podStartE2EDuration="17.398280142s" podCreationTimestamp="2025-12-16 12:25:32 +0000 UTC" firstStartedPulling="2025-12-16 12:25:32.61803226 +0000 UTC m=+27.727537360" lastFinishedPulling="2025-12-16 12:25:48.202484869 +0000 UTC m=+43.311989969" observedRunningTime="2025-12-16 12:25:49.392179767 +0000 UTC m=+44.501684867" watchObservedRunningTime="2025-12-16 12:25:49.398280142 +0000 UTC m=+44.507785242" Dec 16 12:25:49.426054 systemd[1]: Created slice kubepods-besteffort-pod75b3ffde_bcfa_4d53_8c6e_f6a65e50e276.slice - libcontainer container kubepods-besteffort-pod75b3ffde_bcfa_4d53_8c6e_f6a65e50e276.slice. 
Dec 16 12:25:49.453501 kubelet[2858]: I1216 12:25:49.453457 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g46kf\" (UniqueName: \"kubernetes.io/projected/75b3ffde-bcfa-4d53-8c6e-f6a65e50e276-kube-api-access-g46kf\") pod \"whisker-5fd8987bcc-q5l6n\" (UID: \"75b3ffde-bcfa-4d53-8c6e-f6a65e50e276\") " pod="calico-system/whisker-5fd8987bcc-q5l6n" Dec 16 12:25:49.453501 kubelet[2858]: I1216 12:25:49.453511 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75b3ffde-bcfa-4d53-8c6e-f6a65e50e276-whisker-ca-bundle\") pod \"whisker-5fd8987bcc-q5l6n\" (UID: \"75b3ffde-bcfa-4d53-8c6e-f6a65e50e276\") " pod="calico-system/whisker-5fd8987bcc-q5l6n" Dec 16 12:25:49.453501 kubelet[2858]: I1216 12:25:49.453537 2858 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/75b3ffde-bcfa-4d53-8c6e-f6a65e50e276-whisker-backend-key-pair\") pod \"whisker-5fd8987bcc-q5l6n\" (UID: \"75b3ffde-bcfa-4d53-8c6e-f6a65e50e276\") " pod="calico-system/whisker-5fd8987bcc-q5l6n" Dec 16 12:25:49.655000 audit: BPF prog-id=173 op=LOAD Dec 16 12:25:49.655000 audit[4096]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd0a7078 a2=98 a3=ffffdd0a7068 items=0 ppid=3958 pid=4096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.655000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:25:49.655000 audit: BPF prog-id=173 op=UNLOAD Dec 16 12:25:49.655000 audit[4096]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffdd0a7048 a3=0 items=0 ppid=3958 pid=4096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.655000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:25:49.655000 audit: BPF prog-id=174 op=LOAD Dec 16 12:25:49.655000 audit[4096]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd0a6f28 a2=74 a3=95 items=0 ppid=3958 pid=4096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.655000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:25:49.656000 audit: BPF prog-id=174 op=UNLOAD Dec 16 12:25:49.656000 audit[4096]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=3958 pid=4096 auid=4294967295 uid=0 gid=0 euid=0 
suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.656000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:25:49.656000 audit: BPF prog-id=175 op=LOAD Dec 16 12:25:49.656000 audit[4096]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffdd0a6f58 a2=40 a3=ffffdd0a6f88 items=0 ppid=3958 pid=4096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.656000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:25:49.656000 audit: BPF prog-id=175 op=UNLOAD Dec 16 12:25:49.656000 audit[4096]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffdd0a6f88 items=0 ppid=3958 pid=4096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.656000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Dec 16 12:25:49.658000 audit: BPF prog-id=176 op=LOAD Dec 16 12:25:49.658000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffff8cc688 a2=98 a3=ffffff8cc678 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.658000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.658000 audit: BPF prog-id=176 op=UNLOAD Dec 16 12:25:49.658000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffff8cc658 a3=0 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.658000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.658000 audit: BPF prog-id=177 op=LOAD Dec 16 12:25:49.658000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffff8cc318 a2=74 a3=95 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.658000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.659000 audit: BPF prog-id=177 op=UNLOAD Dec 16 12:25:49.659000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.659000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.659000 audit: BPF prog-id=178 op=LOAD Dec 16 12:25:49.659000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffff8cc378 a2=94 a3=2 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.659000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.659000 audit: BPF prog-id=178 op=UNLOAD Dec 16 12:25:49.659000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.659000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.741773 containerd[1611]: time="2025-12-16T12:25:49.741726685Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fd8987bcc-q5l6n,Uid:75b3ffde-bcfa-4d53-8c6e-f6a65e50e276,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:49.778000 audit: BPF prog-id=179 op=LOAD Dec 16 12:25:49.778000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffff8cc338 a2=40 a3=ffffff8cc368 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.778000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.778000 audit: BPF prog-id=179 op=UNLOAD Dec 16 12:25:49.778000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffff8cc368 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.778000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.788000 audit: BPF prog-id=180 op=LOAD Dec 16 12:25:49.788000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffff8cc348 a2=94 a3=4 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.788000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.788000 audit: BPF prog-id=180 op=UNLOAD Dec 16 12:25:49.788000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.788000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.788000 audit: BPF prog-id=181 op=LOAD Dec 16 12:25:49.788000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffff8cc188 a2=94 a3=5 items=0 
ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.788000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.788000 audit: BPF prog-id=181 op=UNLOAD Dec 16 12:25:49.788000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.788000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.788000 audit: BPF prog-id=182 op=LOAD Dec 16 12:25:49.788000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffff8cc3b8 a2=94 a3=6 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.788000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.788000 audit: BPF prog-id=182 op=UNLOAD Dec 16 12:25:49.788000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.788000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.789000 audit: BPF prog-id=183 op=LOAD Dec 16 12:25:49.789000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffff8cbb88 a2=94 a3=83 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.789000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.789000 audit: BPF prog-id=184 op=LOAD Dec 16 12:25:49.789000 audit[4097]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffff8cb948 a2=94 a3=2 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.789000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.789000 audit: BPF prog-id=184 op=UNLOAD Dec 16 12:25:49.789000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.789000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.790000 audit: BPF prog-id=183 op=UNLOAD Dec 16 12:25:49.790000 audit[4097]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=25147620 a3=2513ab00 items=0 ppid=3958 pid=4097 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 
key=(null) Dec 16 12:25:49.790000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Dec 16 12:25:49.803000 audit: BPF prog-id=185 op=LOAD Dec 16 12:25:49.803000 audit[4111]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe73f3f38 a2=98 a3=ffffe73f3f28 items=0 ppid=3958 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.803000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:25:49.803000 audit: BPF prog-id=185 op=UNLOAD Dec 16 12:25:49.803000 audit[4111]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffe73f3f08 a3=0 items=0 ppid=3958 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.803000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:25:49.803000 audit: BPF prog-id=186 op=LOAD Dec 16 12:25:49.803000 audit[4111]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe73f3de8 a2=74 a3=95 items=0 ppid=3958 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.803000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:25:49.803000 audit: BPF prog-id=186 op=UNLOAD Dec 16 12:25:49.803000 audit[4111]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=3958 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.803000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:25:49.803000 audit: BPF prog-id=187 op=LOAD Dec 16 12:25:49.803000 audit[4111]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffe73f3e18 a2=40 a3=ffffe73f3e48 items=0 ppid=3958 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.803000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:25:49.803000 audit: BPF prog-id=187 op=UNLOAD Dec 16 12:25:49.803000 audit[4111]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffe73f3e48 items=0 ppid=3958 pid=4111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.803000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Dec 16 12:25:49.900099 systemd-networkd[1501]: vxlan.calico: Link UP Dec 16 12:25:49.900106 systemd-networkd[1501]: vxlan.calico: Gained carrier Dec 16 12:25:49.945000 audit: BPF prog-id=188 op=LOAD Dec 16 12:25:49.945000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0177b98 a2=98 a3=ffffd0177b88 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.945000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.945000 audit: BPF prog-id=188 op=UNLOAD Dec 16 12:25:49.945000 audit[4144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd0177b68 a3=0 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.945000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=189 op=LOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd0177878 a2=74 a3=95 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=189 op=UNLOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE 
proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=190 op=LOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd01778d8 a2=94 a3=2 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=190 op=UNLOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=191 op=LOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0177758 a2=40 a3=ffffd0177788 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=191 op=UNLOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffd0177788 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=192 op=LOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd01778a8 a2=94 a3=b7 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.946000 audit: BPF prog-id=192 op=UNLOAD Dec 16 12:25:49.946000 audit[4144]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.946000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.948000 audit: BPF prog-id=193 op=LOAD Dec 16 12:25:49.948000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd0176f58 a2=94 a3=2 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.948000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.948000 audit: BPF prog-id=193 op=UNLOAD Dec 16 12:25:49.948000 audit[4144]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.948000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.948000 audit: BPF prog-id=194 op=LOAD Dec 16 12:25:49.948000 audit[4144]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffd01770e8 a2=94 a3=30 items=0 ppid=3958 pid=4144 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.948000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Dec 16 12:25:49.956000 audit: BPF prog-id=195 op=LOAD Dec 16 12:25:49.956000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffff10c398 a2=98 a3=ffffff10c388 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.956000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:49.956000 audit: BPF prog-id=195 op=UNLOAD Dec 16 12:25:49.956000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffff10c368 a3=0 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.956000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:49.956000 audit: BPF prog-id=196 op=LOAD Dec 16 12:25:49.956000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffff10c028 a2=74 a3=95 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.956000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:49.956000 audit: BPF prog-id=196 op=UNLOAD Dec 16 12:25:49.956000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.956000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:49.956000 audit: BPF prog-id=197 op=LOAD Dec 16 12:25:49.956000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffff10c088 a2=94 a3=2 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.956000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:49.956000 audit: BPF prog-id=197 op=UNLOAD Dec 16 12:25:49.956000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:49.956000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.016389 systemd-networkd[1501]: cali8850193e256: Link UP Dec 16 12:25:50.019630 systemd-networkd[1501]: cali8850193e256: Gained carrier Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.844 [INFO][4098] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0 whisker-5fd8987bcc- calico-system 75b3ffde-bcfa-4d53-8c6e-f6a65e50e276 929 0 2025-12-16 12:25:49 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:5fd8987bcc projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 whisker-5fd8987bcc-q5l6n eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali8850193e256 [] [] }} 
ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.845 [INFO][4098] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.924 [INFO][4125] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" HandleID="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.924 [INFO][4125] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" HandleID="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d35a0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"whisker-5fd8987bcc-q5l6n", "timestamp":"2025-12-16 12:25:49.924303233 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.924 [INFO][4125] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.924 [INFO][4125] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.924 [INFO][4125] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.942 [INFO][4125] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.964 [INFO][4125] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.972 [INFO][4125] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.975 [INFO][4125] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.978 [INFO][4125] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.978 [INFO][4125] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.980 [INFO][4125] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607 Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.989 [INFO][4125] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.997 [INFO][4125] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.65/26] block=192.168.63.64/26 handle="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.997 [INFO][4125] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.65/26] handle="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.997 [INFO][4125] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
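For reference, the IPAM result traced above checks out with the standard library: the node's affinity block 192.168.63.64/26 holds 64 addresses and contains the address handed to the pod.

    # Quick sanity check of the calico IPAM trace above (stdlib only).
    import ipaddress

    block = ipaddress.ip_network("192.168.63.64/26")   # affinity block for this node
    pod_ip = ipaddress.ip_address("192.168.63.65")     # address claimed for the whisker pod

    print(block.num_addresses)                              # 64
    print(pod_ip in block)                                  # True
    print(block.network_address, block.broadcast_address)   # 192.168.63.64 192.168.63.127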
Dec 16 12:25:50.044583 containerd[1611]: 2025-12-16 12:25:49.997 [INFO][4125] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.65/26] IPv6=[] ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" HandleID="k8s-pod-network.59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.046600 containerd[1611]: 2025-12-16 12:25:50.001 [INFO][4098] cni-plugin/k8s.go 418: Populated endpoint ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0", GenerateName:"whisker-5fd8987bcc-", Namespace:"calico-system", SelfLink:"", UID:"75b3ffde-bcfa-4d53-8c6e-f6a65e50e276", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5fd8987bcc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"whisker-5fd8987bcc-q5l6n", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.63.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8850193e256", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:50.046600 containerd[1611]: 2025-12-16 12:25:50.001 [INFO][4098] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.65/32] ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.046600 containerd[1611]: 2025-12-16 12:25:50.001 [INFO][4098] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali8850193e256 ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.046600 containerd[1611]: 2025-12-16 12:25:50.018 [INFO][4098] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.046600 containerd[1611]: 2025-12-16 12:25:50.022 [INFO][4098] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" 
Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0", GenerateName:"whisker-5fd8987bcc-", Namespace:"calico-system", SelfLink:"", UID:"75b3ffde-bcfa-4d53-8c6e-f6a65e50e276", ResourceVersion:"929", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 49, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"5fd8987bcc", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607", Pod:"whisker-5fd8987bcc-q5l6n", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.63.65/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali8850193e256", MAC:"62:a4:25:66:53:a9", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:50.046600 containerd[1611]: 2025-12-16 12:25:50.040 [INFO][4098] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" Namespace="calico-system" Pod="whisker-5fd8987bcc-q5l6n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-whisker--5fd8987bcc--q5l6n-eth0" Dec 16 12:25:50.111057 containerd[1611]: time="2025-12-16T12:25:50.110690692Z" level=info msg="connecting to shim 59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607" address="unix:///run/containerd/s/3ea3bca1c9f9af3fce6e83c57ae44bcb4028a4d7e357965286463a649f93afdd" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:50.140659 systemd[1]: Started cri-containerd-59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607.scope - libcontainer container 59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607. 
Dec 16 12:25:50.152000 audit: BPF prog-id=198 op=LOAD Dec 16 12:25:50.152000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffff10c048 a2=40 a3=ffffff10c078 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.152000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.153000 audit: BPF prog-id=198 op=UNLOAD Dec 16 12:25:50.153000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffff10c078 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.153000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=199 op=LOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffff10c058 a2=94 a3=4 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=199 op=UNLOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=200 op=LOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffff10be98 a2=94 a3=5 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=200 op=UNLOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE 
proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=201 op=LOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffff10c0c8 a2=94 a3=6 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=201 op=UNLOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.169000 audit: BPF prog-id=202 op=LOAD Dec 16 12:25:50.169000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffff10b898 a2=94 a3=83 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.169000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.170000 audit: BPF prog-id=203 op=LOAD Dec 16 12:25:50.170000 audit[4148]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffff10b658 a2=94 a3=2 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.170000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.170000 audit: BPF prog-id=203 op=UNLOAD Dec 16 12:25:50.170000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.170000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.170000 audit: BPF prog-id=202 op=UNLOAD Dec 16 12:25:50.170000 audit[4148]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1317c620 a3=1316fb00 items=0 ppid=3958 pid=4148 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.170000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Dec 16 12:25:50.174000 audit: BPF prog-id=204 op=LOAD Dec 16 12:25:50.175000 audit: BPF prog-id=205 op=LOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.175000 audit: BPF prog-id=205 op=UNLOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.175000 audit: BPF prog-id=206 op=LOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.175000 audit: BPF prog-id=207 op=LOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.175000 audit: BPF prog-id=207 op=UNLOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.175000 audit: BPF prog-id=206 op=UNLOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.175000 audit: BPF prog-id=208 op=LOAD Dec 16 12:25:50.175000 audit[4183]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4172 pid=4183 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.175000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3539393839303533633032353464623335653463333664663965613765 Dec 16 12:25:50.181000 audit: BPF prog-id=194 op=UNLOAD Dec 16 12:25:50.181000 audit[3958]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=40006dba00 a2=0 a3=0 items=0 ppid=3950 pid=3958 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.181000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Dec 16 12:25:50.211767 containerd[1611]: time="2025-12-16T12:25:50.211420695Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-5fd8987bcc-q5l6n,Uid:75b3ffde-bcfa-4d53-8c6e-f6a65e50e276,Namespace:calico-system,Attempt:0,} returns sandbox id \"59989053c0254db35e4c36df9ea7eae4cbf0dfc12029dc3756e717f3203b9607\"" Dec 16 12:25:50.215113 containerd[1611]: time="2025-12-16T12:25:50.215047200Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:25:50.269000 audit[4230]: NETFILTER_CFG table=nat:119 family=2 entries=15 op=nft_register_chain pid=4230 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:50.269000 audit[4230]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffc4ec4bf0 a2=0 a3=ffff99466fa8 items=0 ppid=3958 pid=4230 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.269000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:50.270000 audit[4232]: NETFILTER_CFG table=mangle:120 family=2 entries=16 op=nft_register_chain pid=4232 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:50.270000 audit[4232]: SYSCALL 
arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffdfd34000 a2=0 a3=ffff9f3ecfa8 items=0 ppid=3958 pid=4232 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.270000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:50.275000 audit[4231]: NETFILTER_CFG table=filter:121 family=2 entries=39 op=nft_register_chain pid=4231 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:50.275000 audit[4231]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=18968 a0=3 a1=ffffc11ffb90 a2=0 a3=ffff8f9dffa8 items=0 ppid=3958 pid=4231 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.275000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:50.286000 audit[4234]: NETFILTER_CFG table=raw:122 family=2 entries=21 op=nft_register_chain pid=4234 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:50.286000 audit[4234]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffca3a80c0 a2=0 a3=ffffbbd13fa8 items=0 ppid=3958 pid=4234 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.286000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:50.334000 audit[4261]: NETFILTER_CFG table=filter:123 family=2 entries=59 op=nft_register_chain pid=4261 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:50.334000 audit[4261]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=35860 a0=3 a1=ffffff6189d0 a2=0 a3=ffff86f6bfa8 items=0 ppid=3958 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:50.334000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:50.552943 containerd[1611]: time="2025-12-16T12:25:50.552741427Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:50.555142 containerd[1611]: time="2025-12-16T12:25:50.555036178Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:25:50.555314 containerd[1611]: time="2025-12-16T12:25:50.555057418Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:50.555505 kubelet[2858]: E1216 12:25:50.555451 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = 
failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:25:50.555941 kubelet[2858]: E1216 12:25:50.555520 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:25:50.555941 kubelet[2858]: E1216 12:25:50.555624 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:50.558011 containerd[1611]: time="2025-12-16T12:25:50.557917686Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:25:50.904856 containerd[1611]: time="2025-12-16T12:25:50.904665117Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:50.906750 containerd[1611]: time="2025-12-16T12:25:50.906645949Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:25:50.906920 containerd[1611]: time="2025-12-16T12:25:50.906706709Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:50.907136 kubelet[2858]: E1216 12:25:50.907049 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:25:50.907136 kubelet[2858]: E1216 12:25:50.907118 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:25:50.907546 kubelet[2858]: E1216 12:25:50.907440 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:50.907546 kubelet[2858]: E1216 12:25:50.907494 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with 
ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:25:51.047677 kubelet[2858]: I1216 12:25:51.047617 2858 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1a05c39e-610b-4af0-af9e-5a096e11d6cf" path="/var/lib/kubelet/pods/1a05c39e-610b-4af0-af9e-5a096e11d6cf/volumes" Dec 16 12:25:51.313750 kubelet[2858]: E1216 12:25:51.313698 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:25:51.347000 audit[4274]: NETFILTER_CFG table=filter:124 family=2 entries=20 op=nft_register_rule pid=4274 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:51.347000 audit[4274]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffd45fa960 a2=0 a3=1 items=0 ppid=2994 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:51.347000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:51.355000 audit[4274]: NETFILTER_CFG table=nat:125 family=2 entries=14 op=nft_register_rule pid=4274 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:51.355000 audit[4274]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffd45fa960 a2=0 a3=1 items=0 ppid=2994 pid=4274 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:51.355000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:51.535011 systemd-networkd[1501]: vxlan.calico: Gained IPv6LL Dec 16 12:25:51.918727 systemd-networkd[1501]: cali8850193e256: Gained IPv6LL Dec 16 12:25:53.044516 containerd[1611]: time="2025-12-16T12:25:53.044035401Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-ctclc,Uid:2053ff95-314d-4312-973b-a00f2cf38258,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:25:53.047946 containerd[1611]: time="2025-12-16T12:25:53.047774387Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-5kz8n,Uid:e5eca681-3514-4805-908a-df03e7d148ad,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:53.239986 systemd-networkd[1501]: cali2964272303b: Link UP Dec 16 
12:25:53.241224 systemd-networkd[1501]: cali2964272303b: Gained carrier Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.125 [INFO][4279] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0 calico-apiserver-79c4b989f5- calico-apiserver 2053ff95-314d-4312-973b-a00f2cf38258 857 0 2025-12-16 12:25:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:79c4b989f5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 calico-apiserver-79c4b989f5-ctclc eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali2964272303b [] [] }} ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.125 [INFO][4279] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.165 [INFO][4301] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" HandleID="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.165 [INFO][4301] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" HandleID="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b220), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"calico-apiserver-79c4b989f5-ctclc", "timestamp":"2025-12-16 12:25:53.165362901 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.165 [INFO][4301] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.165 [INFO][4301] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.165 [INFO][4301] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.184 [INFO][4301] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.194 [INFO][4301] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.202 [INFO][4301] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.205 [INFO][4301] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.209 [INFO][4301] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.209 [INFO][4301] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.212 [INFO][4301] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.218 [INFO][4301] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.228 [INFO][4301] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.66/26] block=192.168.63.64/26 handle="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.228 [INFO][4301] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.66/26] handle="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.228 [INFO][4301] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
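
At this point the IPAM plugin has claimed 192.168.63.66/26 for the calico-apiserver pod from the node-affine block 192.168.63.64/26 (the whisker pod received .65 above, and goldmane will receive .67 below). As a rough sketch of the block arithmetic only, and not Calico's actual allocator, which records each allocation and handle in the datastore under the host-wide lock seen in these entries:

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// Block this node holds an affinity for, per the IPAM entries above.
	block := netip.MustParsePrefix("192.168.63.64/26")

	// A /26 holds 2^(32-26) = 64 addresses; Calico carves the pod CIDR into
	// such blocks and prefers allocating from blocks affine to the local node.
	fmt.Println("addresses in block:", 1<<(32-block.Bits()))

	// Illustration only: walk the block in order, starting at .65, the first
	// address claimed in this log. Real Calico IPAM does not simply hand out
	// consecutive addresses.
	addr := block.Addr().Next()
	for i := 0; i < 3; i++ {
		fmt.Println("claimed:", addr) // .65 whisker, .66 apiserver, .67 goldmane
		addr = addr.Next()
	}
}
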
Dec 16 12:25:53.266667 containerd[1611]: 2025-12-16 12:25:53.228 [INFO][4301] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.66/26] IPv6=[] ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" HandleID="k8s-pod-network.bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.268449 containerd[1611]: 2025-12-16 12:25:53.233 [INFO][4279] cni-plugin/k8s.go 418: Populated endpoint ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0", GenerateName:"calico-apiserver-79c4b989f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"2053ff95-314d-4312-973b-a00f2cf38258", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79c4b989f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"calico-apiserver-79c4b989f5-ctclc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2964272303b", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:53.268449 containerd[1611]: 2025-12-16 12:25:53.233 [INFO][4279] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.66/32] ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.268449 containerd[1611]: 2025-12-16 12:25:53.233 [INFO][4279] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali2964272303b ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.268449 containerd[1611]: 2025-12-16 12:25:53.243 [INFO][4279] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.268449 containerd[1611]: 2025-12-16 
12:25:53.243 [INFO][4279] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0", GenerateName:"calico-apiserver-79c4b989f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"2053ff95-314d-4312-973b-a00f2cf38258", ResourceVersion:"857", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79c4b989f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb", Pod:"calico-apiserver-79c4b989f5-ctclc", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.66/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali2964272303b", MAC:"a2:6e:d6:29:21:ea", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:53.268449 containerd[1611]: 2025-12-16 12:25:53.263 [INFO][4279] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-ctclc" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--ctclc-eth0" Dec 16 12:25:53.294000 audit[4322]: NETFILTER_CFG table=filter:126 family=2 entries=50 op=nft_register_chain pid=4322 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:53.294000 audit[4322]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=28208 a0=3 a1=ffffd4931f80 a2=0 a3=ffff88899fa8 items=0 ppid=3958 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.294000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:53.322746 containerd[1611]: time="2025-12-16T12:25:53.322699785Z" level=info msg="connecting to shim bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb" address="unix:///run/containerd/s/13b04415123c472ae3750a403b8c75ab9729121c4d6b15a3fe6add9c26f73986" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:53.363969 systemd[1]: Started cri-containerd-bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb.scope - libcontainer container 
bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb. Dec 16 12:25:53.368793 systemd-networkd[1501]: calide91af06e35: Link UP Dec 16 12:25:53.371984 systemd-networkd[1501]: calide91af06e35: Gained carrier Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.123 [INFO][4275] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0 goldmane-7c778bb748- calico-system e5eca681-3514-4805-908a-df03e7d148ad 855 0 2025-12-16 12:25:29 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 goldmane-7c778bb748-5kz8n eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calide91af06e35 [] [] }} ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.124 [INFO][4275] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.200 [INFO][4300] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" HandleID="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.201 [INFO][4300] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" HandleID="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x4000379720), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"goldmane-7c778bb748-5kz8n", "timestamp":"2025-12-16 12:25:53.200850247 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.201 [INFO][4300] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.228 [INFO][4300] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.228 [INFO][4300] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.288 [INFO][4300] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.307 [INFO][4300] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.320 [INFO][4300] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.324 [INFO][4300] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.332 [INFO][4300] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.332 [INFO][4300] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.336 [INFO][4300] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66 Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.346 [INFO][4300] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.357 [INFO][4300] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.67/26] block=192.168.63.64/26 handle="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.357 [INFO][4300] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.67/26] handle="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.357 [INFO][4300] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
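
The whisker and whisker-backend pulls above failed because ghcr.io returned 404 Not Found for those tags; containerd surfaces that as ErrImagePull, kubelet puts the containers into ImagePullBackOff, and the apiserver and goldmane image pulls further down fail the same way. As orientation only, a minimal Go sketch of the capped exponential backoff behind the "Back-off pulling image" messages, with assumed (not logged) default delays:

package main

import (
	"fmt"
	"time"
)

func main() {
	// Assumed defaults for kubelet's image-pull backoff (roughly 10s initial,
	// doubling, capped at 5 minutes); these figures are not read from this log.
	const (
		initialDelay = 10 * time.Second
		maxDelay     = 5 * time.Minute
	)
	delay := initialDelay
	for attempt := 1; attempt <= 6; attempt++ {
		fmt.Printf("pull attempt %d failed (404 from registry), ImagePullBackOff for %s\n", attempt, delay)
		delay *= 2
		if delay > maxDelay {
			delay = maxDelay
		}
	}
}
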
Dec 16 12:25:53.401491 containerd[1611]: 2025-12-16 12:25:53.357 [INFO][4300] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.67/26] IPv6=[] ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" HandleID="k8s-pod-network.2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.402244 containerd[1611]: 2025-12-16 12:25:53.362 [INFO][4275] cni-plugin/k8s.go 418: Populated endpoint ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"e5eca681-3514-4805-908a-df03e7d148ad", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"goldmane-7c778bb748-5kz8n", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.63.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calide91af06e35", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:53.402244 containerd[1611]: 2025-12-16 12:25:53.363 [INFO][4275] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.67/32] ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.402244 containerd[1611]: 2025-12-16 12:25:53.363 [INFO][4275] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calide91af06e35 ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.402244 containerd[1611]: 2025-12-16 12:25:53.374 [INFO][4275] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.402244 containerd[1611]: 2025-12-16 12:25:53.376 [INFO][4275] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" 
Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"e5eca681-3514-4805-908a-df03e7d148ad", ResourceVersion:"855", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 29, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66", Pod:"goldmane-7c778bb748-5kz8n", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.63.67/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calide91af06e35", MAC:"7a:f4:94:82:43:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:53.402244 containerd[1611]: 2025-12-16 12:25:53.395 [INFO][4275] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" Namespace="calico-system" Pod="goldmane-7c778bb748-5kz8n" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-goldmane--7c778bb748--5kz8n-eth0" Dec 16 12:25:53.411000 audit: BPF prog-id=209 op=LOAD Dec 16 12:25:53.412769 kernel: kauditd_printk_skb: 237 callbacks suppressed Dec 16 12:25:53.412866 kernel: audit: type=1334 audit(1765887953.411:649): prog-id=209 op=LOAD Dec 16 12:25:53.411000 audit: BPF prog-id=210 op=LOAD Dec 16 12:25:53.415375 kernel: audit: type=1334 audit(1765887953.411:650): prog-id=210 op=LOAD Dec 16 12:25:53.411000 audit[4343]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.419253 kernel: audit: type=1300 audit(1765887953.411:650): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176180 a2=98 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.424049 kernel: audit: type=1327 audit(1765887953.411:650): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.411000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.412000 audit: BPF prog-id=210 op=UNLOAD Dec 16 12:25:53.412000 audit[4343]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.425464 kernel: audit: type=1334 audit(1765887953.412:651): prog-id=210 op=UNLOAD Dec 16 12:25:53.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.430600 kernel: audit: type=1300 audit(1765887953.412:651): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.430767 kernel: audit: type=1327 audit(1765887953.412:651): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.412000 audit: BPF prog-id=211 op=LOAD Dec 16 12:25:53.412000 audit[4343]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.444659 kernel: audit: type=1334 audit(1765887953.412:652): prog-id=211 op=LOAD Dec 16 12:25:53.445097 kernel: audit: type=1300 audit(1765887953.412:652): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001763e8 a2=98 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.449619 kernel: audit: type=1327 audit(1765887953.412:652): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.412000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.413000 audit: BPF prog-id=212 op=LOAD Dec 16 12:25:53.413000 audit[4343]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000176168 a2=98 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.413000 audit: BPF prog-id=212 op=UNLOAD Dec 16 12:25:53.413000 audit[4343]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.413000 audit: BPF prog-id=211 op=UNLOAD Dec 16 12:25:53.413000 audit[4343]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.413000 audit: BPF prog-id=213 op=LOAD Dec 16 12:25:53.413000 audit[4343]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000176648 a2=98 a3=0 items=0 ppid=4330 pid=4343 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.413000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6263386131306565633233646537353038636138303334383234336163 Dec 16 12:25:53.434000 audit[4370]: NETFILTER_CFG table=filter:127 family=2 entries=48 op=nft_register_chain pid=4370 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:53.434000 audit[4370]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=26368 a0=3 a1=fffff395fd20 a2=0 a3=ffff87948fa8 items=0 ppid=3958 pid=4370 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.434000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:53.467653 containerd[1611]: time="2025-12-16T12:25:53.467608516Z" level=info msg="connecting to shim 2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66" address="unix:///run/containerd/s/1408b18b70695fa658e7f0be99e4a2678c09004b62bbf5004b925683e8a3f947" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:53.498907 containerd[1611]: time="2025-12-16T12:25:53.498837037Z" level=info msg="RunPodSandbox 
for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-ctclc,Uid:2053ff95-314d-4312-973b-a00f2cf38258,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"bc8a10eec23de7508ca80348243acb68ee5a94b2f1e71a4f235dbda4638249cb\"" Dec 16 12:25:53.502769 containerd[1611]: time="2025-12-16T12:25:53.502634343Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:25:53.523553 systemd[1]: Started cri-containerd-2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66.scope - libcontainer container 2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66. Dec 16 12:25:53.539000 audit: BPF prog-id=214 op=LOAD Dec 16 12:25:53.540000 audit: BPF prog-id=215 op=LOAD Dec 16 12:25:53.540000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.540000 audit: BPF prog-id=215 op=UNLOAD Dec 16 12:25:53.540000 audit[4398]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.540000 audit: BPF prog-id=216 op=LOAD Dec 16 12:25:53.540000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.540000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.541000 audit: BPF prog-id=217 op=LOAD Dec 16 12:25:53.541000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.541000 audit: BPF prog-id=217 op=UNLOAD Dec 16 12:25:53.541000 audit[4398]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 
items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.541000 audit: BPF prog-id=216 op=UNLOAD Dec 16 12:25:53.541000 audit[4398]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.541000 audit: BPF prog-id=218 op=LOAD Dec 16 12:25:53.541000 audit[4398]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4380 pid=4398 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:53.541000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3265343835623263623461663135313365666161366336373532376163 Dec 16 12:25:53.570918 containerd[1611]: time="2025-12-16T12:25:53.570764085Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-5kz8n,Uid:e5eca681-3514-4805-908a-df03e7d148ad,Namespace:calico-system,Attempt:0,} returns sandbox id \"2e485b2cb4af1513efaa6c67527ac33f021e7127d283cf48e930d7e735b52e66\"" Dec 16 12:25:53.847248 containerd[1611]: time="2025-12-16T12:25:53.847005558Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:53.848642 containerd[1611]: time="2025-12-16T12:25:53.848570472Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:25:53.848742 containerd[1611]: time="2025-12-16T12:25:53.848689871Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:53.848947 kubelet[2858]: E1216 12:25:53.848905 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:53.849740 kubelet[2858]: E1216 12:25:53.848953 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not 
found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:53.849740 kubelet[2858]: E1216 12:25:53.849295 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-ctclc_calico-apiserver(2053ff95-314d-4312-973b-a00f2cf38258): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:53.849740 kubelet[2858]: E1216 12:25:53.849335 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:25:53.849889 containerd[1611]: time="2025-12-16T12:25:53.849591228Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:25:54.051313 containerd[1611]: time="2025-12-16T12:25:54.050655588Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b97fd59-4xf4h,Uid:6c115690-4c4b-4f0d-9563-71f7671c0428,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:54.203980 containerd[1611]: time="2025-12-16T12:25:54.203712655Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:54.206509 containerd[1611]: time="2025-12-16T12:25:54.206428325Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:25:54.206866 containerd[1611]: time="2025-12-16T12:25:54.206644404Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:54.207243 kubelet[2858]: E1216 12:25:54.207173 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:25:54.207243 kubelet[2858]: E1216 12:25:54.207225 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:25:54.207576 kubelet[2858]: E1216 12:25:54.207488 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-5kz8n_calico-system(e5eca681-3514-4805-908a-df03e7d148ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:54.207576 kubelet[2858]: E1216 12:25:54.207526 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:25:54.235211 systemd-networkd[1501]: cali7976c8c101f: Link UP Dec 16 12:25:54.235480 systemd-networkd[1501]: cali7976c8c101f: Gained carrier Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.133 [INFO][4424] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0 calico-kube-controllers-6b97fd59- calico-system 6c115690-4c4b-4f0d-9563-71f7671c0428 854 0 2025-12-16 12:25:32 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:6b97fd59 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 calico-kube-controllers-6b97fd59-4xf4h eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali7976c8c101f [] [] }} ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.133 [INFO][4424] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.171 [INFO][4436] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" HandleID="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.171 [INFO][4436] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" HandleID="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3800), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"calico-kube-controllers-6b97fd59-4xf4h", "timestamp":"2025-12-16 12:25:54.171744375 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.172 [INFO][4436] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.172 [INFO][4436] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.172 [INFO][4436] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.182 [INFO][4436] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.189 [INFO][4436] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.198 [INFO][4436] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.200 [INFO][4436] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.205 [INFO][4436] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.205 [INFO][4436] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.208 [INFO][4436] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.220 [INFO][4436] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.228 [INFO][4436] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.68/26] block=192.168.63.64/26 handle="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.228 [INFO][4436] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.68/26] handle="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.229 [INFO][4436] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:25:54.257990 containerd[1611]: 2025-12-16 12:25:54.229 [INFO][4436] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.68/26] IPv6=[] ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" HandleID="k8s-pod-network.ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 12:25:54.259050 containerd[1611]: 2025-12-16 12:25:54.232 [INFO][4424] cni-plugin/k8s.go 418: Populated endpoint ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0", GenerateName:"calico-kube-controllers-6b97fd59-", Namespace:"calico-system", SelfLink:"", UID:"6c115690-4c4b-4f0d-9563-71f7671c0428", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b97fd59", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"calico-kube-controllers-6b97fd59-4xf4h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.63.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7976c8c101f", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:54.259050 containerd[1611]: 2025-12-16 12:25:54.232 [INFO][4424] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.68/32] ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 12:25:54.259050 containerd[1611]: 2025-12-16 12:25:54.232 [INFO][4424] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7976c8c101f ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 12:25:54.259050 containerd[1611]: 2025-12-16 12:25:54.235 [INFO][4424] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 
12:25:54.259050 containerd[1611]: 2025-12-16 12:25:54.236 [INFO][4424] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0", GenerateName:"calico-kube-controllers-6b97fd59-", Namespace:"calico-system", SelfLink:"", UID:"6c115690-4c4b-4f0d-9563-71f7671c0428", ResourceVersion:"854", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"6b97fd59", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c", Pod:"calico-kube-controllers-6b97fd59-4xf4h", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.63.68/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali7976c8c101f", MAC:"76:0f:6d:eb:b9:4f", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:54.259050 containerd[1611]: 2025-12-16 12:25:54.254 [INFO][4424] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" Namespace="calico-system" Pod="calico-kube-controllers-6b97fd59-4xf4h" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--kube--controllers--6b97fd59--4xf4h-eth0" Dec 16 12:25:54.282000 audit[4450]: NETFILTER_CFG table=filter:128 family=2 entries=44 op=nft_register_chain pid=4450 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:54.282000 audit[4450]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21952 a0=3 a1=ffffd72493c0 a2=0 a3=ffffb2a27fa8 items=0 ppid=3958 pid=4450 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.282000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:54.287398 containerd[1611]: time="2025-12-16T12:25:54.287321982Z" level=info msg="connecting to shim ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c" address="unix:///run/containerd/s/caa63f6dcf85b10f317a91a06da38951e1beb7c36e7eacb62c9ffc3b9c0cef77" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:54.326224 kubelet[2858]: E1216 12:25:54.326175 2858 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:25:54.338749 systemd[1]: Started cri-containerd-ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c.scope - libcontainer container ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c. Dec 16 12:25:54.340506 kubelet[2858]: E1216 12:25:54.340447 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:25:54.358000 audit: BPF prog-id=219 op=LOAD Dec 16 12:25:54.359000 audit: BPF prog-id=220 op=LOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.359000 audit: BPF prog-id=220 op=UNLOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.359000 audit: BPF prog-id=221 op=LOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.359000 audit: BPF prog-id=222 op=LOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 
a1=4000130168 a2=98 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.359000 audit: BPF prog-id=222 op=UNLOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.359000 audit: BPF prog-id=221 op=UNLOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.359000 audit: BPF prog-id=223 op=LOAD Dec 16 12:25:54.359000 audit[4471]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4460 pid=4471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.359000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6365313133353765613264643861653465616562363064373234333136 Dec 16 12:25:54.481000 audit[4491]: NETFILTER_CFG table=filter:129 family=2 entries=20 op=nft_register_rule pid=4491 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:54.481000 audit[4491]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffcac864b0 a2=0 a3=1 items=0 ppid=2994 pid=4491 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.481000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:54.488000 audit[4491]: NETFILTER_CFG table=nat:130 family=2 entries=14 op=nft_register_rule pid=4491 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:54.488000 audit[4491]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffcac864b0 a2=0 a3=1 items=0 ppid=2994 pid=4491 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:54.488000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:54.508507 containerd[1611]: time="2025-12-16T12:25:54.508453915Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-6b97fd59-4xf4h,Uid:6c115690-4c4b-4f0d-9563-71f7671c0428,Namespace:calico-system,Attempt:0,} returns sandbox id \"ce11357ea2dd8ae4eaeb60d724316e6a98dd9306896e5695d5164a67e558125c\"" Dec 16 12:25:54.511520 containerd[1611]: time="2025-12-16T12:25:54.511429024Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:25:54.827975 containerd[1611]: time="2025-12-16T12:25:54.827590960Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:54.829962 containerd[1611]: time="2025-12-16T12:25:54.829786792Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:25:54.830230 kubelet[2858]: E1216 12:25:54.830164 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:25:54.830230 kubelet[2858]: E1216 12:25:54.830221 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:25:54.830422 containerd[1611]: time="2025-12-16T12:25:54.829899431Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:54.830490 kubelet[2858]: E1216 12:25:54.830421 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6b97fd59-4xf4h_calico-system(6c115690-4c4b-4f0d-9563-71f7671c0428): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:54.830738 kubelet[2858]: E1216 12:25:54.830650 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:25:54.862660 systemd-networkd[1501]: cali2964272303b: Gained IPv6LL Dec 16 12:25:55.044358 containerd[1611]: time="2025-12-16T12:25:55.044304271Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:csi-node-driver-psf64,Uid:3de277ef-70c9-4b08-8b83-d92a9680c7b8,Namespace:calico-system,Attempt:0,}" Dec 16 12:25:55.202451 systemd-networkd[1501]: cali270ebc9b424: Link UP Dec 16 12:25:55.204873 systemd-networkd[1501]: cali270ebc9b424: Gained carrier Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.100 [INFO][4500] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0 csi-node-driver- calico-system 3de277ef-70c9-4b08-8b83-d92a9680c7b8 751 0 2025-12-16 12:25:32 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 csi-node-driver-psf64 eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali270ebc9b424 [] [] }} ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.101 [INFO][4500] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.128 [INFO][4511] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" HandleID="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.129 [INFO][4511] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" HandleID="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"csi-node-driver-psf64", "timestamp":"2025-12-16 12:25:55.128853398 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.129 [INFO][4511] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.129 [INFO][4511] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.129 [INFO][4511] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.142 [INFO][4511] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.151 [INFO][4511] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.162 [INFO][4511] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.166 [INFO][4511] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.173 [INFO][4511] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.173 [INFO][4511] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.177 [INFO][4511] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43 Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.184 [INFO][4511] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.192 [INFO][4511] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.69/26] block=192.168.63.64/26 handle="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.193 [INFO][4511] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.69/26] handle="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.193 [INFO][4511] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Dec 16 12:25:55.226188 containerd[1611]: 2025-12-16 12:25:55.193 [INFO][4511] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.69/26] IPv6=[] ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" HandleID="k8s-pod-network.252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.228000 containerd[1611]: 2025-12-16 12:25:55.197 [INFO][4500] cni-plugin/k8s.go 418: Populated endpoint ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3de277ef-70c9-4b08-8b83-d92a9680c7b8", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"csi-node-driver-psf64", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.63.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali270ebc9b424", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:55.228000 containerd[1611]: 2025-12-16 12:25:55.197 [INFO][4500] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.69/32] ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.228000 containerd[1611]: 2025-12-16 12:25:55.197 [INFO][4500] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali270ebc9b424 ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.228000 containerd[1611]: 2025-12-16 12:25:55.199 [INFO][4500] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.228000 containerd[1611]: 2025-12-16 12:25:55.200 [INFO][4500] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"3de277ef-70c9-4b08-8b83-d92a9680c7b8", ResourceVersion:"751", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 32, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43", Pod:"csi-node-driver-psf64", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.63.69/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali270ebc9b424", MAC:"56:12:87:40:54:2c", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:55.228000 containerd[1611]: 2025-12-16 12:25:55.223 [INFO][4500] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" Namespace="calico-system" Pod="csi-node-driver-psf64" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-csi--node--driver--psf64-eth0" Dec 16 12:25:55.247751 systemd-networkd[1501]: calide91af06e35: Gained IPv6LL Dec 16 12:25:55.250000 audit[4525]: NETFILTER_CFG table=filter:131 family=2 entries=54 op=nft_register_chain pid=4525 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:55.250000 audit[4525]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=25992 a0=3 a1=ffffdf8a6930 a2=0 a3=ffffa283efa8 items=0 ppid=3958 pid=4525 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.250000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:55.268351 containerd[1611]: time="2025-12-16T12:25:55.266612888Z" level=info msg="connecting to shim 252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43" address="unix:///run/containerd/s/618847256270abca7d01fd9a7f50dfe312eb98920297347acf92c91969bf100f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:55.328514 systemd[1]: Started cri-containerd-252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43.scope - libcontainer container 252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43. 
Dec 16 12:25:55.348000 audit: BPF prog-id=224 op=LOAD Dec 16 12:25:55.348000 audit: BPF prog-id=225 op=LOAD Dec 16 12:25:55.348000 audit[4549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.349000 audit: BPF prog-id=225 op=UNLOAD Dec 16 12:25:55.349000 audit[4549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.350000 audit: BPF prog-id=226 op=LOAD Dec 16 12:25:55.350000 audit[4549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.350000 audit: BPF prog-id=227 op=LOAD Dec 16 12:25:55.350000 audit[4549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.350000 audit: BPF prog-id=227 op=UNLOAD Dec 16 12:25:55.350000 audit[4549]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.350000 audit: BPF prog-id=226 op=UNLOAD Dec 16 12:25:55.350000 audit[4549]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.350000 audit: BPF prog-id=228 op=LOAD Dec 16 12:25:55.350000 audit[4549]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4535 pid=4549 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.350000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235326364323235356539626638343164363235393935393462333434 Dec 16 12:25:55.358824 kubelet[2858]: E1216 12:25:55.358759 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:25:55.360478 kubelet[2858]: E1216 12:25:55.358782 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:25:55.361567 kubelet[2858]: E1216 12:25:55.361437 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:25:55.375664 systemd-networkd[1501]: cali7976c8c101f: Gained IPv6LL Dec 16 12:25:55.403888 containerd[1611]: time="2025-12-16T12:25:55.403831021Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-psf64,Uid:3de277ef-70c9-4b08-8b83-d92a9680c7b8,Namespace:calico-system,Attempt:0,} returns sandbox id \"252cd2255e9bf841d62599594b3441d80197fdd26e3a2836b770715e4009bf43\"" Dec 16 12:25:55.410028 containerd[1611]: time="2025-12-16T12:25:55.409851718Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:25:55.502000 audit[4581]: NETFILTER_CFG table=filter:132 family=2 entries=20 op=nft_register_rule pid=4581 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:55.502000 audit[4581]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffe1653990 a2=0 a3=1 items=0 ppid=2994 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.502000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:55.509000 audit[4581]: NETFILTER_CFG table=nat:133 family=2 entries=14 op=nft_register_rule pid=4581 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:55.509000 audit[4581]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffe1653990 a2=0 a3=1 items=0 ppid=2994 pid=4581 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:55.509000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:55.744523 containerd[1611]: time="2025-12-16T12:25:55.744410761Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:55.746144 containerd[1611]: time="2025-12-16T12:25:55.745991195Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:25:55.746318 containerd[1611]: time="2025-12-16T12:25:55.746110114Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:55.746535 kubelet[2858]: E1216 12:25:55.746445 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:25:55.746535 kubelet[2858]: E1216 12:25:55.746501 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:25:55.746701 kubelet[2858]: E1216 12:25:55.746594 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:55.749771 containerd[1611]: time="2025-12-16T12:25:55.749715261Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:25:56.043425 containerd[1611]: time="2025-12-16T12:25:56.043386416Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-tzbch,Uid:91bf0287-40fc-4bcc-aa76-a7888b9a94ef,Namespace:calico-apiserver,Attempt:0,}" Dec 16 12:25:56.097726 containerd[1611]: time="2025-12-16T12:25:56.097547458Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:56.099077 containerd[1611]: time="2025-12-16T12:25:56.098920373Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:25:56.099817 containerd[1611]: time="2025-12-16T12:25:56.098971173Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:56.100047 kubelet[2858]: E1216 12:25:56.100004 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:25:56.100152 kubelet[2858]: E1216 12:25:56.100055 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:25:56.101285 kubelet[2858]: E1216 12:25:56.100354 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:56.101285 kubelet[2858]: E1216 12:25:56.100406 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:56.222194 systemd-networkd[1501]: calia313dd5d69e: Link UP Dec 16 12:25:56.223803 systemd-networkd[1501]: calia313dd5d69e: Gained carrier Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.118 [INFO][4583] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0 calico-apiserver-79c4b989f5- calico-apiserver 91bf0287-40fc-4bcc-aa76-a7888b9a94ef 858 0 2025-12-16 12:25:24 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver 
pod-template-hash:79c4b989f5 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 calico-apiserver-79c4b989f5-tzbch eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] calia313dd5d69e [] [] }} ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.118 [INFO][4583] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.156 [INFO][4595] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" HandleID="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.156 [INFO][4595] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" HandleID="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b200), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"calico-apiserver-79c4b989f5-tzbch", "timestamp":"2025-12-16 12:25:56.156328363 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.156 [INFO][4595] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.157 [INFO][4595] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.157 [INFO][4595] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.171 [INFO][4595] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.177 [INFO][4595] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.183 [INFO][4595] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.186 [INFO][4595] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.190 [INFO][4595] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.190 [INFO][4595] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.191 [INFO][4595] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4 Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.197 [INFO][4595] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.206 [INFO][4595] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.70/26] block=192.168.63.64/26 handle="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.206 [INFO][4595] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.70/26] handle="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.206 [INFO][4595] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
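The IPAM trace above claims 192.168.63.70 out of the affinity block 192.168.63.64/26 already bound to this host. A minimal sketch, using only Python's standard ipaddress module (chosen here for illustration; Calico itself does this in Go), of checking a claimed address against the block:

import ipaddress

block = ipaddress.ip_network("192.168.63.64/26")   # affinity block named in the log
claimed = ipaddress.ip_address("192.168.63.70")    # address assigned to the apiserver pod

print(claimed in block)       # True: the claim lies inside the host's block
print(block.num_addresses)    # 64: a /26 block covers 64 addresses

The later assignments in this section (192.168.63.71 and 192.168.63.72 for the two coredns pods) come from the same block, which is why each IPAM pass finds the existing affinity confirmed instead of claiming a new one.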
Dec 16 12:25:56.244967 containerd[1611]: 2025-12-16 12:25:56.207 [INFO][4595] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.70/26] IPv6=[] ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" HandleID="k8s-pod-network.4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.245799 containerd[1611]: 2025-12-16 12:25:56.213 [INFO][4583] cni-plugin/k8s.go 418: Populated endpoint ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0", GenerateName:"calico-apiserver-79c4b989f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"91bf0287-40fc-4bcc-aa76-a7888b9a94ef", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79c4b989f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"calico-apiserver-79c4b989f5-tzbch", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia313dd5d69e", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:56.245799 containerd[1611]: 2025-12-16 12:25:56.213 [INFO][4583] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.70/32] ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.245799 containerd[1611]: 2025-12-16 12:25:56.213 [INFO][4583] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia313dd5d69e ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.245799 containerd[1611]: 2025-12-16 12:25:56.224 [INFO][4583] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.245799 containerd[1611]: 2025-12-16 
12:25:56.225 [INFO][4583] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0", GenerateName:"calico-apiserver-79c4b989f5-", Namespace:"calico-apiserver", SelfLink:"", UID:"91bf0287-40fc-4bcc-aa76-a7888b9a94ef", ResourceVersion:"858", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 24, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"79c4b989f5", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4", Pod:"calico-apiserver-79c4b989f5-tzbch", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.63.70/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"calia313dd5d69e", MAC:"1e:82:dd:b3:79:c7", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:56.245799 containerd[1611]: 2025-12-16 12:25:56.240 [INFO][4583] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" Namespace="calico-apiserver" Pod="calico-apiserver-79c4b989f5-tzbch" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-calico--apiserver--79c4b989f5--tzbch-eth0" Dec 16 12:25:56.268000 audit[4608]: NETFILTER_CFG table=filter:134 family=2 entries=49 op=nft_register_chain pid=4608 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:56.268000 audit[4608]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=25436 a0=3 a1=ffffd8ae7850 a2=0 a3=ffff92245fa8 items=0 ppid=3958 pid=4608 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.268000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:56.276238 containerd[1611]: time="2025-12-16T12:25:56.276164685Z" level=info msg="connecting to shim 4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4" address="unix:///run/containerd/s/7b46ab6837f4a6d548a977e3710edb6c83e2f71a13b4348437b7215680799800" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:56.308649 systemd[1]: Started cri-containerd-4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4.scope - libcontainer container 
4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4. Dec 16 12:25:56.335000 audit: BPF prog-id=229 op=LOAD Dec 16 12:25:56.336000 audit: BPF prog-id=230 op=LOAD Dec 16 12:25:56.336000 audit[4628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.336000 audit: BPF prog-id=230 op=UNLOAD Dec 16 12:25:56.336000 audit[4628]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.336000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.337000 audit: BPF prog-id=231 op=LOAD Dec 16 12:25:56.337000 audit[4628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.337000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.337000 audit: BPF prog-id=232 op=LOAD Dec 16 12:25:56.337000 audit[4628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.337000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.337000 audit: BPF prog-id=232 op=UNLOAD Dec 16 12:25:56.337000 audit[4628]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.337000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.337000 audit: 
BPF prog-id=231 op=UNLOAD Dec 16 12:25:56.337000 audit[4628]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.337000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.337000 audit: BPF prog-id=233 op=LOAD Dec 16 12:25:56.337000 audit[4628]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=4617 pid=4628 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:56.337000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3431343064646466383036386532666136373861613534616232303765 Dec 16 12:25:56.362862 kubelet[2858]: E1216 12:25:56.361899 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:25:56.366549 kubelet[2858]: E1216 12:25:56.366497 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:56.384564 containerd[1611]: time="2025-12-16T12:25:56.384525248Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-79c4b989f5-tzbch,Uid:91bf0287-40fc-4bcc-aa76-a7888b9a94ef,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"4140dddf8068e2fa678aa54ab207edced9b8fd98225947510e5b4fb6997e02e4\"" Dec 16 12:25:56.388050 containerd[1611]: time="2025-12-16T12:25:56.387727677Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:25:56.718770 systemd-networkd[1501]: cali270ebc9b424: Gained IPv6LL Dec 16 12:25:56.739311 containerd[1611]: time="2025-12-16T12:25:56.739178511Z" 
level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:25:56.740646 containerd[1611]: time="2025-12-16T12:25:56.740561506Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:25:56.740793 containerd[1611]: time="2025-12-16T12:25:56.740679225Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:25:56.741143 kubelet[2858]: E1216 12:25:56.741049 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:56.741405 kubelet[2858]: E1216 12:25:56.741356 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:25:56.741877 kubelet[2858]: E1216 12:25:56.741792 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-tzbch_calico-apiserver(91bf0287-40fc-4bcc-aa76-a7888b9a94ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:25:56.742188 kubelet[2858]: E1216 12:25:56.742112 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:25:57.045374 containerd[1611]: time="2025-12-16T12:25:57.045036634Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-snrff,Uid:b72f115e-557a-48f9-b3b7-382ad8af5dee,Namespace:kube-system,Attempt:0,}" Dec 16 12:25:57.048120 containerd[1611]: time="2025-12-16T12:25:57.047771624Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9mpwx,Uid:07aaeefc-07f0-4e68-bf3e-404493256fb6,Namespace:kube-system,Attempt:0,}" Dec 16 12:25:57.230348 systemd-networkd[1501]: cali665c3eda36f: Link UP Dec 16 12:25:57.231564 systemd-networkd[1501]: cali665c3eda36f: Gained carrier Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.112 [INFO][4654] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0 coredns-66bc5c9577- kube-system b72f115e-557a-48f9-b3b7-382ad8af5dee 851 0 2025-12-16 12:25:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 coredns-66bc5c9577-snrff eth0 coredns [] [] 
[kns.kube-system ksa.kube-system.coredns] cali665c3eda36f [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.112 [INFO][4654] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.161 [INFO][4680] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" HandleID="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.161 [INFO][4680] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" HandleID="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b120), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"coredns-66bc5c9577-snrff", "timestamp":"2025-12-16 12:25:57.161295053 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.161 [INFO][4680] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.161 [INFO][4680] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.161 [INFO][4680] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.180 [INFO][4680] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.188 [INFO][4680] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.194 [INFO][4680] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.198 [INFO][4680] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.202 [INFO][4680] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.202 [INFO][4680] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.205 [INFO][4680] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3 Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.211 [INFO][4680] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.220 [INFO][4680] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.71/26] block=192.168.63.64/26 handle="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.220 [INFO][4680] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.71/26] handle="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.220 [INFO][4680] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
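Every PullImage attempt in this section ends with "fetch failed after status: 404 Not Found" from ghcr.io, so the ErrImagePull and ImagePullBackOff messages point at tags missing from the registry rather than at a node-side problem. A rough sketch of checking a tag directly, assuming ghcr.io follows the standard OCI distribution token flow for anonymous pulls (the token endpoint, scope parameter, and Accept header below are assumptions about that flow, not taken from this log):

import json
import urllib.error
import urllib.request

def tag_exists(repo: str, tag: str) -> bool:
    # Request an anonymous pull token for the repository (standard registry token flow).
    token_url = f"https://ghcr.io/token?scope=repository:{repo}:pull"
    with urllib.request.urlopen(token_url) as resp:
        token = json.load(resp)["token"]
    # Fetching the manifest returns 404 when the tag does not exist.
    req = urllib.request.Request(
        f"https://ghcr.io/v2/{repo}/manifests/{tag}",
        headers={
            "Authorization": f"Bearer {token}",
            # Some registries want additional manifest media types listed here.
            "Accept": "application/vnd.oci.image.index.v1+json",
        },
    )
    try:
        with urllib.request.urlopen(req):
            return True
    except urllib.error.HTTPError as err:
        if err.code == 404:
            return False
        raise

# At the time of this log, ghcr.io/flatcar/calico/csi:v3.30.4 resolved to 404.
print(tag_exists("flatcar/calico/csi", "v3.30.4"))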
Dec 16 12:25:57.254573 containerd[1611]: 2025-12-16 12:25:57.220 [INFO][4680] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.71/26] IPv6=[] ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" HandleID="k8s-pod-network.e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.256248 containerd[1611]: 2025-12-16 12:25:57.226 [INFO][4654] cni-plugin/k8s.go 418: Populated endpoint ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b72f115e-557a-48f9-b3b7-382ad8af5dee", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"coredns-66bc5c9577-snrff", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali665c3eda36f", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:57.256248 containerd[1611]: 2025-12-16 12:25:57.227 [INFO][4654] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.71/32] ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.256248 containerd[1611]: 2025-12-16 12:25:57.227 [INFO][4654] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali665c3eda36f ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" 
WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.256248 containerd[1611]: 2025-12-16 12:25:57.231 [INFO][4654] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.256446 containerd[1611]: 2025-12-16 12:25:57.232 [INFO][4654] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"b72f115e-557a-48f9-b3b7-382ad8af5dee", ResourceVersion:"851", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3", Pod:"coredns-66bc5c9577-snrff", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.71/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali665c3eda36f", MAC:"ce:0e:02:d4:f5:9e", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:57.256446 containerd[1611]: 2025-12-16 12:25:57.249 [INFO][4654] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" Namespace="kube-system" Pod="coredns-66bc5c9577-snrff" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--snrff-eth0" Dec 16 12:25:57.280017 containerd[1611]: time="2025-12-16T12:25:57.279973263Z" level=info msg="connecting to shim e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3" 
address="unix:///run/containerd/s/1d798ce75ab76a26f1e404f6be65320401882d74f60fb756323980593f550b5f" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:57.287000 audit[4711]: NETFILTER_CFG table=filter:135 family=2 entries=58 op=nft_register_chain pid=4711 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:57.287000 audit[4711]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=27288 a0=3 a1=ffffd7642400 a2=0 a3=ffff9b030fa8 items=0 ppid=3958 pid=4711 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.287000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:57.324842 systemd[1]: Started cri-containerd-e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3.scope - libcontainer container e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3. Dec 16 12:25:57.347000 audit: BPF prog-id=234 op=LOAD Dec 16 12:25:57.348000 audit: BPF prog-id=235 op=LOAD Dec 16 12:25:57.348000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.348000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.349000 audit: BPF prog-id=235 op=UNLOAD Dec 16 12:25:57.349000 audit[4722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.349000 audit: BPF prog-id=236 op=LOAD Dec 16 12:25:57.349000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.349000 audit: BPF prog-id=237 op=LOAD Dec 16 12:25:57.349000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.349000 audit: BPF prog-id=237 op=UNLOAD Dec 16 12:25:57.349000 audit[4722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.349000 audit: BPF prog-id=236 op=UNLOAD Dec 16 12:25:57.349000 audit[4722]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.349000 audit: BPF prog-id=238 op=LOAD Dec 16 12:25:57.349000 audit[4722]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=4710 pid=4722 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.349000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6534653539653931393037376234393764343532376236383238346634 Dec 16 12:25:57.350889 systemd-networkd[1501]: cali471b769b74c: Link UP Dec 16 12:25:57.355629 systemd-networkd[1501]: cali471b769b74c: Gained carrier Dec 16 12:25:57.358947 systemd-networkd[1501]: calia313dd5d69e: Gained IPv6LL Dec 16 12:25:57.376801 kubelet[2858]: E1216 12:25:57.375084 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:25:57.379856 kubelet[2858]: E1216 12:25:57.379578 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.126 [INFO][4655] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0 coredns-66bc5c9577- kube-system 07aaeefc-07f0-4e68-bf3e-404493256fb6 853 0 2025-12-16 12:25:11 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4515-1-0-6-95bdd2e3e7 coredns-66bc5c9577-9mpwx eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali471b769b74c [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.126 [INFO][4655] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.177 [INFO][4685] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" HandleID="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.178 [INFO][4685] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" HandleID="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002cba00), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4515-1-0-6-95bdd2e3e7", "pod":"coredns-66bc5c9577-9mpwx", "timestamp":"2025-12-16 12:25:57.177028556 +0000 UTC"}, Hostname:"ci-4515-1-0-6-95bdd2e3e7", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.178 [INFO][4685] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.220 [INFO][4685] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.220 [INFO][4685] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4515-1-0-6-95bdd2e3e7' Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.282 [INFO][4685] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.296 [INFO][4685] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.306 [INFO][4685] ipam/ipam.go 511: Trying affinity for 192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.309 [INFO][4685] ipam/ipam.go 158: Attempting to load block cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.314 [INFO][4685] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.63.64/26 host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.314 [INFO][4685] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.63.64/26 handle="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.316 [INFO][4685] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.328 [INFO][4685] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.63.64/26 handle="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.341 [INFO][4685] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.63.72/26] block=192.168.63.64/26 handle="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.341 [INFO][4685] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.63.72/26] handle="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" host="ci-4515-1-0-6-95bdd2e3e7" Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.341 [INFO][4685] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
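All SYSCALL audit records in this section carry arch=c00000b7, the audit value for 64-bit little-endian AArch64, so the syscall numbers follow the arm64 table: 211 is sendmsg (the netlink message behind each NETFILTER_CFG record), 280 is bpf (the "BPF prog-id ... op=LOAD" records emitted while runc sets up a container), and 57 is close (paired with the op=UNLOAD records when the prog fd is closed). A small lookup helper covering just the numbers seen here, a sketch rather than a complete audit decoder:

# arm64 (AUDIT_ARCH_AARCH64 = 0xc00000b7) syscall numbers that appear in this log.
ARM64_SYSCALLS = {
    57: "close",     # follows each "BPF prog-id ... op=UNLOAD" record
    211: "sendmsg",  # netlink send behind the NETFILTER_CFG records
    280: "bpf",      # "BPF prog-id ... op=LOAD" records from runc
}

def describe_syscall(arch: str, nr: int) -> str:
    if arch.lower() != "c00000b7":
        return f"arch {arch} not handled by this sketch"
    return ARM64_SYSCALLS.get(nr, f"arm64 syscall {nr}")

print(describe_syscall("c00000b7", 280))  # -> bpf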
Dec 16 12:25:57.391983 containerd[1611]: 2025-12-16 12:25:57.341 [INFO][4685] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.63.72/26] IPv6=[] ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" HandleID="k8s-pod-network.79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Workload="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.392781 containerd[1611]: 2025-12-16 12:25:57.345 [INFO][4655] cni-plugin/k8s.go 418: Populated endpoint ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"07aaeefc-07f0-4e68-bf3e-404493256fb6", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"", Pod:"coredns-66bc5c9577-9mpwx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali471b769b74c", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:57.392781 containerd[1611]: 2025-12-16 12:25:57.346 [INFO][4655] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.63.72/32] ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.392781 containerd[1611]: 2025-12-16 12:25:57.346 [INFO][4655] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali471b769b74c ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" 
WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.392781 containerd[1611]: 2025-12-16 12:25:57.356 [INFO][4655] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.392993 containerd[1611]: 2025-12-16 12:25:57.357 [INFO][4655] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"07aaeefc-07f0-4e68-bf3e-404493256fb6", ResourceVersion:"853", Generation:0, CreationTimestamp:time.Date(2025, time.December, 16, 12, 25, 11, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4515-1-0-6-95bdd2e3e7", ContainerID:"79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f", Pod:"coredns-66bc5c9577-9mpwx", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.63.72/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali471b769b74c", MAC:"f2:40:2b:d4:09:d2", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Dec 16 12:25:57.392993 containerd[1611]: 2025-12-16 12:25:57.386 [INFO][4655] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" Namespace="kube-system" Pod="coredns-66bc5c9577-9mpwx" WorkloadEndpoint="ci--4515--1--0--6--95bdd2e3e7-k8s-coredns--66bc5c9577--9mpwx-eth0" Dec 16 12:25:57.428719 containerd[1611]: time="2025-12-16T12:25:57.428661245Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:coredns-66bc5c9577-snrff,Uid:b72f115e-557a-48f9-b3b7-382ad8af5dee,Namespace:kube-system,Attempt:0,} returns sandbox id \"e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3\"" Dec 16 12:25:57.447022 containerd[1611]: time="2025-12-16T12:25:57.446729419Z" level=info msg="CreateContainer within sandbox \"e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:25:57.471764 containerd[1611]: time="2025-12-16T12:25:57.471662409Z" level=info msg="connecting to shim 79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f" address="unix:///run/containerd/s/44dab41645832518f472c563f4133810c874b7261e564611ce386127b65eeb0e" namespace=k8s.io protocol=ttrpc version=3 Dec 16 12:25:57.501540 containerd[1611]: time="2025-12-16T12:25:57.501495221Z" level=info msg="Container e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:57.510000 audit[4765]: NETFILTER_CFG table=filter:136 family=2 entries=20 op=nft_register_rule pid=4765 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:57.510000 audit[4765]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc0f75780 a2=0 a3=1 items=0 ppid=2994 pid=4765 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.510000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:57.519740 containerd[1611]: time="2025-12-16T12:25:57.518693919Z" level=info msg="CreateContainer within sandbox \"e4e59e919077b497d4527b68284f4ecd12b823f6827ecf992af06943a9388bf3\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b\"" Dec 16 12:25:57.521507 containerd[1611]: time="2025-12-16T12:25:57.521459589Z" level=info msg="StartContainer for \"e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b\"" Dec 16 12:25:57.522000 audit[4785]: NETFILTER_CFG table=filter:137 family=2 entries=52 op=nft_register_chain pid=4785 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Dec 16 12:25:57.522000 audit[4785]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=23892 a0=3 a1=ffffe1b9e450 a2=0 a3=ffffb33fffa8 items=0 ppid=3958 pid=4785 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.522000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Dec 16 12:25:57.524134 containerd[1611]: time="2025-12-16T12:25:57.523627741Z" level=info msg="connecting to shim e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b" address="unix:///run/containerd/s/1d798ce75ab76a26f1e404f6be65320401882d74f60fb756323980593f550b5f" protocol=ttrpc version=3 Dec 16 12:25:57.536000 audit[4765]: NETFILTER_CFG table=nat:138 family=2 entries=14 op=nft_register_rule pid=4765 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:57.536000 audit[4765]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffc0f75780 a2=0 a3=1 items=0 ppid=2994 pid=4765 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.536000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:57.548538 systemd[1]: Started cri-containerd-79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f.scope - libcontainer container 79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f. Dec 16 12:25:57.569000 audit: BPF prog-id=239 op=LOAD Dec 16 12:25:57.571000 audit: BPF prog-id=240 op=LOAD Dec 16 12:25:57.571000 audit[4779]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.571000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.572000 audit: BPF prog-id=240 op=UNLOAD Dec 16 12:25:57.572000 audit[4779]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.572000 audit: BPF prog-id=241 op=LOAD Dec 16 12:25:57.572000 audit[4779]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.572000 audit: BPF prog-id=242 op=LOAD Dec 16 12:25:57.572000 audit[4779]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.572000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.573000 audit: BPF prog-id=242 op=UNLOAD Dec 16 12:25:57.573000 audit[4779]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.573000 audit: BPF prog-id=241 op=UNLOAD Dec 16 12:25:57.573000 audit[4779]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.573000 audit: BPF prog-id=243 op=LOAD Dec 16 12:25:57.573000 audit[4779]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4767 pid=4779 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.573000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3739663835323434616162323638366665656565346136333162333362 Dec 16 12:25:57.594536 systemd[1]: Started cri-containerd-e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b.scope - libcontainer container e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b. 
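The audit PROCTITLE records in the surrounding entries carry the audited process's full command line, hex-encoded with NUL bytes separating the arguments. As a minimal illustration (a sketch added alongside the log, not part of the captured output), the short iptables-restore proctitle seen above decodes like this in Python:

    # Decode an audit PROCTITLE value: argv is hex-encoded, NUL-separated.
    hex_title = (
        "69707461626C65732D726573746F7265002D7700"
        "35002D2D6E6F666C757368002D2D636F756E74657273"
    )
    argv = [a.decode() for a in bytes.fromhex(hex_title).split(b"\x00")]
    print(argv)  # ['iptables-restore', '-w', '5', '--noflush', '--counters']

The longer runc proctitles in the same block decode the same way, to runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/79f85244... (cut off at the audit proctitle length limit).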
Dec 16 12:25:57.633000 audit: BPF prog-id=244 op=LOAD Dec 16 12:25:57.634000 audit: BPF prog-id=245 op=LOAD Dec 16 12:25:57.634000 audit[4792]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.634000 audit: BPF prog-id=245 op=UNLOAD Dec 16 12:25:57.634000 audit[4792]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.634000 audit: BPF prog-id=246 op=LOAD Dec 16 12:25:57.634000 audit[4792]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.634000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.635000 audit: BPF prog-id=247 op=LOAD Dec 16 12:25:57.635000 audit[4792]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.635000 audit: BPF prog-id=247 op=UNLOAD Dec 16 12:25:57.635000 audit[4792]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.635000 audit: BPF prog-id=246 op=UNLOAD Dec 16 12:25:57.635000 audit[4792]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.635000 audit: BPF prog-id=248 op=LOAD Dec 16 12:25:57.635000 audit[4792]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4710 pid=4792 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.635000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6536366662366233656262623532343063343765356163393163336239 Dec 16 12:25:57.637866 containerd[1611]: time="2025-12-16T12:25:57.637824688Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-9mpwx,Uid:07aaeefc-07f0-4e68-bf3e-404493256fb6,Namespace:kube-system,Attempt:0,} returns sandbox id \"79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f\"" Dec 16 12:25:57.647845 containerd[1611]: time="2025-12-16T12:25:57.647654692Z" level=info msg="CreateContainer within sandbox \"79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Dec 16 12:25:57.678787 containerd[1611]: time="2025-12-16T12:25:57.677765383Z" level=info msg="Container d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24: CDI devices from CRI Config.CDIDevices: []" Dec 16 12:25:57.680776 containerd[1611]: time="2025-12-16T12:25:57.680745412Z" level=info msg="StartContainer for \"e66fb6b3ebbb5240c47e5ac91c3b979f06c34115173131987c3927557f37711b\" returns successfully" Dec 16 12:25:57.686860 containerd[1611]: time="2025-12-16T12:25:57.686133713Z" level=info msg="CreateContainer within sandbox \"79f85244aab2686feeee4a631b33bebc764f9a42df22571199f06c71a4c9d03f\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24\"" Dec 16 12:25:57.688767 containerd[1611]: time="2025-12-16T12:25:57.688712663Z" level=info msg="StartContainer for \"d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24\"" Dec 16 12:25:57.689714 containerd[1611]: time="2025-12-16T12:25:57.689630620Z" level=info msg="connecting to shim d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24" address="unix:///run/containerd/s/44dab41645832518f472c563f4133810c874b7261e564611ce386127b65eeb0e" protocol=ttrpc version=3 Dec 16 12:25:57.718633 systemd[1]: Started cri-containerd-d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24.scope - libcontainer container d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24. 
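The BPF prog-id LOAD/UNLOAD audit entries that bracket each container start are consistent with runc installing per-container eBPF programs (e.g. the cgroup-v2 device filter): the LOAD records are bpf(2) calls whose exit value (21 or 23) is the returned program fd, and the matching UNLOAD records coincide with close(2) on that same fd (a0=15 or a0=17 in hex). A rough decoder for the arch/syscall fields, under the assumption that the numbers follow the arm64 (asm-generic) syscall table:

    # Hypothetical helper for reading the SYSCALL records above; the
    # syscall-number mapping is assumed from the arm64 asm-generic table.
    AUDIT_ARCH_AARCH64 = 0xC00000B7
    ARM64_SYSCALLS = {57: "close", 211: "sendmsg", 280: "bpf"}

    def describe(arch: int, nr: int) -> str:
        arch_name = "aarch64" if arch == AUDIT_ARCH_AARCH64 else hex(arch)
        return f"{arch_name}/{ARM64_SYSCALLS.get(nr, str(nr))}"

    print(describe(0xC00000B7, 280))  # aarch64/bpf     -> the op=LOAD records
    print(describe(0xC00000B7, 57))   # aarch64/close   -> the op=UNLOAD records
    print(describe(0xC00000B7, 211))  # aarch64/sendmsg -> iptables-restore netlink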
Dec 16 12:25:57.741000 audit: BPF prog-id=249 op=LOAD Dec 16 12:25:57.742000 audit: BPF prog-id=250 op=LOAD Dec 16 12:25:57.742000 audit[4835]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.742000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.742000 audit: BPF prog-id=250 op=UNLOAD Dec 16 12:25:57.742000 audit[4835]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.742000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.743000 audit: BPF prog-id=251 op=LOAD Dec 16 12:25:57.743000 audit[4835]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.743000 audit: BPF prog-id=252 op=LOAD Dec 16 12:25:57.743000 audit[4835]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.743000 audit: BPF prog-id=252 op=UNLOAD Dec 16 12:25:57.743000 audit[4835]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.743000 audit: BPF prog-id=251 op=UNLOAD Dec 16 12:25:57.743000 audit[4835]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.743000 audit: BPF prog-id=253 op=LOAD Dec 16 12:25:57.743000 audit[4835]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4767 pid=4835 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:57.743000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6430376437356165383032663665616363613533666631306635383932 Dec 16 12:25:57.774834 containerd[1611]: time="2025-12-16T12:25:57.774791752Z" level=info msg="StartContainer for \"d07d75ae802f6eacca53ff10f589285f7ab3c5fed10326d9acb9ab189120ee24\" returns successfully" Dec 16 12:25:58.384032 kubelet[2858]: E1216 12:25:58.383737 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:25:58.402813 kubelet[2858]: I1216 12:25:58.401676 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-snrff" podStartSLOduration=47.401653817 podStartE2EDuration="47.401653817s" podCreationTimestamp="2025-12-16 12:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:25:58.401059139 +0000 UTC m=+53.510564279" watchObservedRunningTime="2025-12-16 12:25:58.401653817 +0000 UTC m=+53.511158917" Dec 16 12:25:58.463545 kubelet[2858]: I1216 12:25:58.463462 2858 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-9mpwx" podStartSLOduration=47.463441436 podStartE2EDuration="47.463441436s" podCreationTimestamp="2025-12-16 12:25:11 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2025-12-16 12:25:58.440511398 +0000 UTC m=+53.550016498" watchObservedRunningTime="2025-12-16 12:25:58.463441436 +0000 UTC m=+53.572946536" Dec 16 12:25:58.554000 audit[4872]: NETFILTER_CFG table=filter:139 family=2 entries=17 op=nft_register_rule pid=4872 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:58.555656 kernel: kauditd_printk_skb: 224 callbacks suppressed Dec 16 12:25:58.555725 kernel: audit: type=1325 audit(1765887958.554:733): table=filter:139 family=2 entries=17 
op=nft_register_rule pid=4872 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:58.554000 audit[4872]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc7f858b0 a2=0 a3=1 items=0 ppid=2994 pid=4872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:58.560317 kernel: audit: type=1300 audit(1765887958.554:733): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc7f858b0 a2=0 a3=1 items=0 ppid=2994 pid=4872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:58.554000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:58.562177 kernel: audit: type=1327 audit(1765887958.554:733): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:58.562000 audit[4872]: NETFILTER_CFG table=nat:140 family=2 entries=35 op=nft_register_chain pid=4872 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:58.565309 kernel: audit: type=1325 audit(1765887958.562:734): table=nat:140 family=2 entries=35 op=nft_register_chain pid=4872 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:58.565401 kernel: audit: type=1300 audit(1765887958.562:734): arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffc7f858b0 a2=0 a3=1 items=0 ppid=2994 pid=4872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:58.562000 audit[4872]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=ffffc7f858b0 a2=0 a3=1 items=0 ppid=2994 pid=4872 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:58.562000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:58.568493 kernel: audit: type=1327 audit(1765887958.562:734): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:58.766609 systemd-networkd[1501]: cali471b769b74c: Gained IPv6LL Dec 16 12:25:59.280553 systemd-networkd[1501]: cali665c3eda36f: Gained IPv6LL Dec 16 12:25:59.583000 audit[4876]: NETFILTER_CFG table=filter:141 family=2 entries=14 op=nft_register_rule pid=4876 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:59.583000 audit[4876]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff17aa140 a2=0 a3=1 items=0 ppid=2994 pid=4876 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:59.588836 kernel: audit: type=1325 audit(1765887959.583:735): table=filter:141 family=2 entries=14 op=nft_register_rule pid=4876 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:59.588954 kernel: audit: type=1300 audit(1765887959.583:735): arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 
a1=fffff17aa140 a2=0 a3=1 items=0 ppid=2994 pid=4876 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:59.588975 kernel: audit: type=1327 audit(1765887959.583:735): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:59.583000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:25:59.600000 audit[4876]: NETFILTER_CFG table=nat:142 family=2 entries=56 op=nft_register_chain pid=4876 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:59.602350 kernel: audit: type=1325 audit(1765887959.600:736): table=nat:142 family=2 entries=56 op=nft_register_chain pid=4876 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:25:59.600000 audit[4876]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=fffff17aa140 a2=0 a3=1 items=0 ppid=2994 pid=4876 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:25:59.600000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:26:03.042936 containerd[1611]: time="2025-12-16T12:26:03.042827265Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:26:03.421997 containerd[1611]: time="2025-12-16T12:26:03.421325107Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:03.422774 containerd[1611]: time="2025-12-16T12:26:03.422722577Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:26:03.423374 containerd[1611]: time="2025-12-16T12:26:03.422828862Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:03.423560 kubelet[2858]: E1216 12:26:03.423373 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:26:03.423560 kubelet[2858]: E1216 12:26:03.423545 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:26:03.424575 kubelet[2858]: E1216 12:26:03.423647 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:03.426118 containerd[1611]: time="2025-12-16T12:26:03.426074824Z" level=info msg="PullImage 
\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:26:03.776885 containerd[1611]: time="2025-12-16T12:26:03.776789560Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:03.778187 containerd[1611]: time="2025-12-16T12:26:03.778117626Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:26:03.778442 containerd[1611]: time="2025-12-16T12:26:03.778217151Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:03.778495 kubelet[2858]: E1216 12:26:03.778450 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:26:03.778613 kubelet[2858]: E1216 12:26:03.778496 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:26:03.778613 kubelet[2858]: E1216 12:26:03.778568 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:03.778725 kubelet[2858]: E1216 12:26:03.778609 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:26:07.041547 containerd[1611]: time="2025-12-16T12:26:07.041476952Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:26:07.397052 containerd[1611]: time="2025-12-16T12:26:07.396669798Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:07.398520 containerd[1611]: time="2025-12-16T12:26:07.398438796Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:26:07.398692 containerd[1611]: time="2025-12-16T12:26:07.398573682Z" level=info msg="stop pulling image 
ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:07.399009 kubelet[2858]: E1216 12:26:07.398958 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:26:07.400077 kubelet[2858]: E1216 12:26:07.399597 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:26:07.400077 kubelet[2858]: E1216 12:26:07.399920 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-5kz8n_calico-system(e5eca681-3514-4805-908a-df03e7d148ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:07.400590 kubelet[2858]: E1216 12:26:07.400376 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:26:08.043322 containerd[1611]: time="2025-12-16T12:26:08.042736868Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:26:08.393880 containerd[1611]: time="2025-12-16T12:26:08.393513864Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:08.395363 containerd[1611]: time="2025-12-16T12:26:08.395288421Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:26:08.395656 containerd[1611]: time="2025-12-16T12:26:08.395393185Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:08.395983 kubelet[2858]: E1216 12:26:08.395629 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:26:08.395983 kubelet[2858]: E1216 12:26:08.395702 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:26:08.395983 kubelet[2858]: E1216 12:26:08.395880 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:08.396216 containerd[1611]: time="2025-12-16T12:26:08.396185899Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:26:08.763625 containerd[1611]: time="2025-12-16T12:26:08.763550049Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:08.764998 containerd[1611]: time="2025-12-16T12:26:08.764896426Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:26:08.765137 containerd[1611]: time="2025-12-16T12:26:08.765014672Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:08.765590 kubelet[2858]: E1216 12:26:08.765484 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:08.765590 kubelet[2858]: E1216 12:26:08.765560 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:08.766713 kubelet[2858]: E1216 12:26:08.766030 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-ctclc_calico-apiserver(2053ff95-314d-4312-973b-a00f2cf38258): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:08.766713 kubelet[2858]: E1216 12:26:08.766071 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:26:08.766903 containerd[1611]: time="2025-12-16T12:26:08.766046116Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:26:09.150923 containerd[1611]: time="2025-12-16T12:26:09.150728501Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:09.152443 containerd[1611]: time="2025-12-16T12:26:09.152382410Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:26:09.152879 containerd[1611]: time="2025-12-16T12:26:09.152670222Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active 
requests=0, bytes read=0" Dec 16 12:26:09.153286 kubelet[2858]: E1216 12:26:09.153044 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:26:09.153286 kubelet[2858]: E1216 12:26:09.153108 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:26:09.153286 kubelet[2858]: E1216 12:26:09.153198 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:09.153448 kubelet[2858]: E1216 12:26:09.153246 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:26:11.042837 containerd[1611]: time="2025-12-16T12:26:11.042751733Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:26:11.391733 containerd[1611]: time="2025-12-16T12:26:11.391232000Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:11.393305 containerd[1611]: time="2025-12-16T12:26:11.393063712Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:26:11.393535 kubelet[2858]: E1216 12:26:11.393458 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:26:11.394056 containerd[1611]: time="2025-12-16T12:26:11.393222318Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:11.394157 kubelet[2858]: E1216 12:26:11.393528 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:26:11.394157 kubelet[2858]: E1216 12:26:11.393802 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6b97fd59-4xf4h_calico-system(6c115690-4c4b-4f0d-9563-71f7671c0428): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:11.394157 kubelet[2858]: E1216 12:26:11.393867 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:26:11.395590 containerd[1611]: time="2025-12-16T12:26:11.394701856Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:26:11.750570 containerd[1611]: time="2025-12-16T12:26:11.750473410Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:11.751953 containerd[1611]: time="2025-12-16T12:26:11.751887025Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:26:11.752136 containerd[1611]: time="2025-12-16T12:26:11.751991349Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:11.752575 kubelet[2858]: E1216 12:26:11.752429 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:11.752575 kubelet[2858]: E1216 12:26:11.752508 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:11.752781 kubelet[2858]: E1216 12:26:11.752607 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-tzbch_calico-apiserver(91bf0287-40fc-4bcc-aa76-a7888b9a94ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:11.752781 kubelet[2858]: E1216 12:26:11.752647 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:26:16.044382 kubelet[2858]: E1216 12:26:16.044089 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:26:18.041765 kubelet[2858]: E1216 12:26:18.041689 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:26:20.045394 kubelet[2858]: E1216 12:26:20.045130 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:26:20.046908 kubelet[2858]: E1216 12:26:20.046662 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:26:23.050624 kubelet[2858]: E1216 12:26:23.050188 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": 
ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:26:26.042985 kubelet[2858]: E1216 12:26:26.042869 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:26:27.047563 containerd[1611]: time="2025-12-16T12:26:27.047520558Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:26:27.467535 containerd[1611]: time="2025-12-16T12:26:27.467464131Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:27.469071 containerd[1611]: time="2025-12-16T12:26:27.469015050Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:26:27.469278 containerd[1611]: time="2025-12-16T12:26:27.469109452Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:27.470358 kubelet[2858]: E1216 12:26:27.470297 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:26:27.470911 kubelet[2858]: E1216 12:26:27.470377 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:26:27.470911 kubelet[2858]: E1216 12:26:27.470476 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:27.471715 containerd[1611]: time="2025-12-16T12:26:27.471667916Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:26:27.832543 containerd[1611]: time="2025-12-16T12:26:27.832395501Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:27.833554 containerd[1611]: time="2025-12-16T12:26:27.833496288Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:26:27.833650 containerd[1611]: time="2025-12-16T12:26:27.833598051Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:27.834230 kubelet[2858]: E1216 12:26:27.834175 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:26:27.834230 kubelet[2858]: E1216 12:26:27.834225 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:26:27.834379 kubelet[2858]: E1216 12:26:27.834304 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:27.834379 kubelet[2858]: E1216 12:26:27.834353 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:26:32.045001 containerd[1611]: time="2025-12-16T12:26:32.044950807Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:26:32.424545 containerd[1611]: time="2025-12-16T12:26:32.424410594Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:32.426730 containerd[1611]: time="2025-12-16T12:26:32.426644042Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:26:32.426843 containerd[1611]: time="2025-12-16T12:26:32.426770765Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:32.427133 kubelet[2858]: E1216 12:26:32.427095 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:26:32.428069 kubelet[2858]: E1216 12:26:32.427971 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed 
to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:26:32.428246 kubelet[2858]: E1216 12:26:32.428223 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:32.429341 containerd[1611]: time="2025-12-16T12:26:32.428308718Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:26:32.894981 containerd[1611]: time="2025-12-16T12:26:32.894933545Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:32.896320 containerd[1611]: time="2025-12-16T12:26:32.896235213Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:26:32.896566 containerd[1611]: time="2025-12-16T12:26:32.896425857Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:32.896771 kubelet[2858]: E1216 12:26:32.896628 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:26:32.896771 kubelet[2858]: E1216 12:26:32.896672 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:26:32.897555 kubelet[2858]: E1216 12:26:32.897387 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-5kz8n_calico-system(e5eca681-3514-4805-908a-df03e7d148ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:32.897555 kubelet[2858]: E1216 12:26:32.897429 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:26:32.897629 containerd[1611]: time="2025-12-16T12:26:32.896996390Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:26:33.257006 containerd[1611]: time="2025-12-16T12:26:33.256589398Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:33.258701 containerd[1611]: time="2025-12-16T12:26:33.258537399Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:26:33.258929 containerd[1611]: time="2025-12-16T12:26:33.258553239Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:33.259579 kubelet[2858]: E1216 12:26:33.259471 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:26:33.259842 kubelet[2858]: E1216 12:26:33.259547 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:26:33.260201 kubelet[2858]: E1216 12:26:33.260066 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:33.261956 kubelet[2858]: E1216 12:26:33.260357 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:26:33.263206 containerd[1611]: time="2025-12-16T12:26:33.263145495Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:26:33.632881 containerd[1611]: time="2025-12-16T12:26:33.631953596Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:33.634238 containerd[1611]: time="2025-12-16T12:26:33.634181363Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:26:33.635059 containerd[1611]: time="2025-12-16T12:26:33.635006060Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:33.635359 kubelet[2858]: E1216 12:26:33.635262 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to 
pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:33.635791 kubelet[2858]: E1216 12:26:33.635484 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:33.635791 kubelet[2858]: E1216 12:26:33.635581 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-ctclc_calico-apiserver(2053ff95-314d-4312-973b-a00f2cf38258): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:33.635791 kubelet[2858]: E1216 12:26:33.635626 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:26:34.042437 containerd[1611]: time="2025-12-16T12:26:34.042311945Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:26:34.390088 containerd[1611]: time="2025-12-16T12:26:34.389910284Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:34.391706 containerd[1611]: time="2025-12-16T12:26:34.391602718Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:26:34.391827 containerd[1611]: time="2025-12-16T12:26:34.391701480Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:34.392195 kubelet[2858]: E1216 12:26:34.392150 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:34.392322 kubelet[2858]: E1216 12:26:34.392201 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:26:34.392322 kubelet[2858]: E1216 12:26:34.392286 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-tzbch_calico-apiserver(91bf0287-40fc-4bcc-aa76-a7888b9a94ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" 
logger="UnhandledError" Dec 16 12:26:34.392499 kubelet[2858]: E1216 12:26:34.392317 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:26:37.051334 containerd[1611]: time="2025-12-16T12:26:37.051254320Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:26:37.411781 containerd[1611]: time="2025-12-16T12:26:37.411575343Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:26:37.413689 containerd[1611]: time="2025-12-16T12:26:37.413615301Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:26:37.413838 containerd[1611]: time="2025-12-16T12:26:37.413726183Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:26:37.414011 kubelet[2858]: E1216 12:26:37.413971 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:26:37.415469 kubelet[2858]: E1216 12:26:37.414020 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:26:37.415469 kubelet[2858]: E1216 12:26:37.414086 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6b97fd59-4xf4h_calico-system(6c115690-4c4b-4f0d-9563-71f7671c0428): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:26:37.415469 kubelet[2858]: E1216 12:26:37.414118 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:26:40.041509 kubelet[2858]: E1216 12:26:40.041454 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:26:45.047674 kubelet[2858]: E1216 12:26:45.047597 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:26:46.042648 kubelet[2858]: E1216 12:26:46.042534 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:26:46.042648 kubelet[2858]: E1216 12:26:46.042591 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:26:47.049426 kubelet[2858]: E1216 12:26:47.049328 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:26:48.042564 kubelet[2858]: E1216 12:26:48.042497 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image 
\\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:26:54.048058 kubelet[2858]: E1216 12:26:54.047847 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:26:58.045225 kubelet[2858]: E1216 12:26:58.045161 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:26:59.041792 kubelet[2858]: E1216 12:26:59.041119 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:27:01.041959 kubelet[2858]: E1216 12:27:01.041906 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:27:01.042871 kubelet[2858]: E1216 12:27:01.042375 2858 pod_workers.go:1324] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:27:02.042201 kubelet[2858]: E1216 12:27:02.042109 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:27:07.043781 kubelet[2858]: E1216 12:27:07.043605 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:27:09.045376 kubelet[2858]: E1216 12:27:09.045168 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:27:11.044338 kubelet[2858]: E1216 12:27:11.043783 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 
12:27:12.041456 kubelet[2858]: E1216 12:27:12.041065 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:27:14.042538 containerd[1611]: time="2025-12-16T12:27:14.042485915Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:27:14.377316 containerd[1611]: time="2025-12-16T12:27:14.374591058Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:14.390346 containerd[1611]: time="2025-12-16T12:27:14.390292340Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:27:14.391166 containerd[1611]: time="2025-12-16T12:27:14.391098826Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:14.391596 kubelet[2858]: E1216 12:27:14.391559 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:27:14.391961 kubelet[2858]: E1216 12:27:14.391607 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:27:14.391961 kubelet[2858]: E1216 12:27:14.391681 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-ctclc_calico-apiserver(2053ff95-314d-4312-973b-a00f2cf38258): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:14.391961 kubelet[2858]: E1216 12:27:14.391710 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:27:15.043999 containerd[1611]: time="2025-12-16T12:27:15.043568893Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Dec 16 12:27:15.393091 containerd[1611]: time="2025-12-16T12:27:15.392675234Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:15.394357 containerd[1611]: time="2025-12-16T12:27:15.394232966Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Dec 16 12:27:15.394357 containerd[1611]: time="2025-12-16T12:27:15.394283366Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:15.394667 kubelet[2858]: E1216 12:27:15.394490 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:27:15.394667 kubelet[2858]: E1216 12:27:15.394538 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Dec 16 12:27:15.394667 kubelet[2858]: E1216 12:27:15.394609 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-79c4b989f5-tzbch_calico-apiserver(91bf0287-40fc-4bcc-aa76-a7888b9a94ef): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:15.394667 kubelet[2858]: E1216 12:27:15.394638 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:27:20.043903 containerd[1611]: time="2025-12-16T12:27:20.043629804Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Dec 16 12:27:20.414510 containerd[1611]: time="2025-12-16T12:27:20.413730725Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:20.416422 containerd[1611]: time="2025-12-16T12:27:20.416357463Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Dec 16 12:27:20.417357 containerd[1611]: time="2025-12-16T12:27:20.416511945Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:20.417450 kubelet[2858]: E1216 12:27:20.417418 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:27:20.417714 kubelet[2858]: E1216 12:27:20.417462 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Dec 16 12:27:20.417714 kubelet[2858]: E1216 12:27:20.417620 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:20.418606 containerd[1611]: time="2025-12-16T12:27:20.418246717Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Dec 16 12:27:20.772218 containerd[1611]: time="2025-12-16T12:27:20.772169926Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:20.773649 containerd[1611]: time="2025-12-16T12:27:20.773596055Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Dec 16 12:27:20.773737 containerd[1611]: time="2025-12-16T12:27:20.773696256Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:20.773995 kubelet[2858]: E1216 12:27:20.773950 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:27:20.774073 kubelet[2858]: E1216 12:27:20.774002 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Dec 16 12:27:20.774617 containerd[1611]: time="2025-12-16T12:27:20.774400221Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Dec 16 12:27:20.774734 kubelet[2858]: E1216 12:27:20.774581 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:21.119790 containerd[1611]: time="2025-12-16T12:27:21.119590874Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:21.122161 containerd[1611]: time="2025-12-16T12:27:21.121860170Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Dec 16 12:27:21.122568 containerd[1611]: time="2025-12-16T12:27:21.121949930Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:21.122759 kubelet[2858]: E1216 12:27:21.122699 2858 log.go:32] "PullImage from image service failed" 
err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:27:21.123212 kubelet[2858]: E1216 12:27:21.122887 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Dec 16 12:27:21.123212 kubelet[2858]: E1216 12:27:21.123080 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-psf64_calico-system(3de277ef-70c9-4b08-8b83-d92a9680c7b8): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:21.123212 kubelet[2858]: E1216 12:27:21.123140 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:27:21.123995 containerd[1611]: time="2025-12-16T12:27:21.123824503Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Dec 16 12:27:21.460743 containerd[1611]: time="2025-12-16T12:27:21.460681151Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:21.462198 containerd[1611]: time="2025-12-16T12:27:21.462146601Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Dec 16 12:27:21.462302 containerd[1611]: time="2025-12-16T12:27:21.462232522Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:21.462467 kubelet[2858]: E1216 12:27:21.462426 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:27:21.462755 kubelet[2858]: E1216 12:27:21.462474 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" 
image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Dec 16 12:27:21.462755 kubelet[2858]: E1216 12:27:21.462539 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-5fd8987bcc-q5l6n_calico-system(75b3ffde-bcfa-4d53-8c6e-f6a65e50e276): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:21.462755 kubelet[2858]: E1216 12:27:21.462575 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:27:25.044262 containerd[1611]: time="2025-12-16T12:27:25.044041903Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Dec 16 12:27:25.412126 containerd[1611]: time="2025-12-16T12:27:25.411955629Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:25.413476 containerd[1611]: time="2025-12-16T12:27:25.413419478Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Dec 16 12:27:25.413585 containerd[1611]: time="2025-12-16T12:27:25.413525879Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:25.414400 kubelet[2858]: E1216 12:27:25.414311 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:27:25.414400 kubelet[2858]: E1216 12:27:25.414371 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Dec 16 12:27:25.416762 kubelet[2858]: E1216 12:27:25.414445 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-5kz8n_calico-system(e5eca681-3514-4805-908a-df03e7d148ad): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:25.416762 kubelet[2858]: E1216 12:27:25.414475 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:27:27.047815 containerd[1611]: time="2025-12-16T12:27:27.047766249Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Dec 16 12:27:27.400090 containerd[1611]: time="2025-12-16T12:27:27.399796639Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Dec 16 12:27:27.401436 containerd[1611]: time="2025-12-16T12:27:27.401376048Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Dec 16 12:27:27.401573 containerd[1611]: time="2025-12-16T12:27:27.401506009Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Dec 16 12:27:27.401792 kubelet[2858]: E1216 12:27:27.401739 2858 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:27:27.402315 kubelet[2858]: E1216 12:27:27.401808 2858 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Dec 16 12:27:27.402315 kubelet[2858]: E1216 12:27:27.401928 2858 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-6b97fd59-4xf4h_calico-system(6c115690-4c4b-4f0d-9563-71f7671c0428): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Dec 16 12:27:27.402315 kubelet[2858]: E1216 12:27:27.402020 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:27:29.048442 kubelet[2858]: E1216 12:27:29.047083 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:27:29.048845 kubelet[2858]: E1216 12:27:29.048488 2858 pod_workers.go:1324] 
"Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:27:34.044567 kubelet[2858]: E1216 12:27:34.044452 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:27:35.047833 kubelet[2858]: E1216 12:27:35.047784 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:27:40.042345 kubelet[2858]: E1216 12:27:40.041717 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:27:40.855523 systemd[1]: Started sshd@7-128.140.49.38:22-139.178.89.65:42862.service - OpenSSH per-connection server daemon (139.178.89.65:42862). Dec 16 12:27:40.860537 kernel: kauditd_printk_skb: 2 callbacks suppressed Dec 16 12:27:40.860643 kernel: audit: type=1130 audit(1765888060.854:737): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-128.140.49.38:22-139.178.89.65:42862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:27:40.854000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-128.140.49.38:22-139.178.89.65:42862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:41.044383 kubelet[2858]: E1216 12:27:41.044281 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:27:41.765000 audit[5031]: USER_ACCT pid=5031 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.769889 sshd[5031]: Accepted publickey for core from 139.178.89.65 port 42862 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:27:41.769000 audit[5031]: CRED_ACQ pid=5031 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.774492 kernel: audit: type=1101 audit(1765888061.765:738): pid=5031 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.774551 kernel: audit: type=1103 audit(1765888061.769:739): pid=5031 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.772728 sshd-session[5031]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:27:41.777145 kernel: audit: type=1006 audit(1765888061.769:740): pid=5031 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=8 res=1 Dec 16 12:27:41.769000 audit[5031]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff93e9d20 a2=3 a3=0 items=0 ppid=1 pid=5031 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:41.780515 kernel: audit: type=1300 audit(1765888061.769:740): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff93e9d20 a2=3 a3=0 items=0 ppid=1 pid=5031 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:41.769000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:41.785314 kernel: audit: type=1327 audit(1765888061.769:740): 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:41.785183 systemd-logind[1580]: New session 8 of user core. Dec 16 12:27:41.793672 systemd[1]: Started session-8.scope - Session 8 of User core. Dec 16 12:27:41.797000 audit[5031]: USER_START pid=5031 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.802336 kernel: audit: type=1105 audit(1765888061.797:741): pid=5031 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.801000 audit[5035]: CRED_ACQ pid=5035 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:41.806351 kernel: audit: type=1103 audit(1765888061.801:742): pid=5035 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:42.391487 sshd[5035]: Connection closed by 139.178.89.65 port 42862 Dec 16 12:27:42.390917 sshd-session[5031]: pam_unix(sshd:session): session closed for user core Dec 16 12:27:42.390000 audit[5031]: USER_END pid=5031 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:42.397650 systemd[1]: sshd@7-128.140.49.38:22-139.178.89.65:42862.service: Deactivated successfully. Dec 16 12:27:42.401637 kernel: audit: type=1106 audit(1765888062.390:743): pid=5031 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:42.401783 kernel: audit: type=1104 audit(1765888062.391:744): pid=5031 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:42.391000 audit[5031]: CRED_DISP pid=5031 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:42.396000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-128.140.49.38:22-139.178.89.65:42862 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:42.403226 systemd[1]: session-8.scope: Deactivated successfully. 
Dec 16 12:27:42.404544 systemd-logind[1580]: Session 8 logged out. Waiting for processes to exit. Dec 16 12:27:42.408044 systemd-logind[1580]: Removed session 8. Dec 16 12:27:44.041468 kubelet[2858]: E1216 12:27:44.041094 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:27:44.042212 kubelet[2858]: E1216 12:27:44.042146 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:27:46.043491 kubelet[2858]: E1216 12:27:46.043418 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:27:46.044133 kubelet[2858]: E1216 12:27:46.044053 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:27:47.579418 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:27:47.579520 kernel: audit: type=1130 audit(1765888067.575:746): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-128.140.49.38:22-139.178.89.65:42876 comm="systemd" 
exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:47.575000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-128.140.49.38:22-139.178.89.65:42876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:47.576174 systemd[1]: Started sshd@8-128.140.49.38:22-139.178.89.65:42876.service - OpenSSH per-connection server daemon (139.178.89.65:42876). Dec 16 12:27:48.469000 audit[5051]: USER_ACCT pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.474587 sshd[5051]: Accepted publickey for core from 139.178.89.65 port 42876 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:27:48.472000 audit[5051]: CRED_ACQ pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.477068 kernel: audit: type=1101 audit(1765888068.469:747): pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.477145 kernel: audit: type=1103 audit(1765888068.472:748): pid=5051 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.475539 sshd-session[5051]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:27:48.479814 kernel: audit: type=1006 audit(1765888068.472:749): pid=5051 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Dec 16 12:27:48.485764 kernel: audit: type=1300 audit(1765888068.472:749): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe996ecd0 a2=3 a3=0 items=0 ppid=1 pid=5051 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:48.472000 audit[5051]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe996ecd0 a2=3 a3=0 items=0 ppid=1 pid=5051 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:48.488149 kernel: audit: type=1327 audit(1765888068.472:749): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:48.472000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:48.491220 systemd-logind[1580]: New session 9 of user core. Dec 16 12:27:48.496526 systemd[1]: Started session-9.scope - Session 9 of User core. 
Dec 16 12:27:48.500000 audit[5051]: USER_START pid=5051 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.503000 audit[5054]: CRED_ACQ pid=5054 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.507101 kernel: audit: type=1105 audit(1765888068.500:750): pid=5051 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:48.507175 kernel: audit: type=1103 audit(1765888068.503:751): pid=5054 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:49.067372 sshd[5054]: Connection closed by 139.178.89.65 port 42876 Dec 16 12:27:49.068590 sshd-session[5051]: pam_unix(sshd:session): session closed for user core Dec 16 12:27:49.070000 audit[5051]: USER_END pid=5051 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:49.079699 systemd[1]: sshd@8-128.140.49.38:22-139.178.89.65:42876.service: Deactivated successfully. Dec 16 12:27:49.071000 audit[5051]: CRED_DISP pid=5051 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:49.085402 kernel: audit: type=1106 audit(1765888069.070:752): pid=5051 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:49.085502 kernel: audit: type=1104 audit(1765888069.071:753): pid=5051 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:49.083962 systemd[1]: session-9.scope: Deactivated successfully. Dec 16 12:27:49.078000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-128.140.49.38:22-139.178.89.65:42876 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:49.086843 systemd-logind[1580]: Session 9 logged out. Waiting for processes to exit. Dec 16 12:27:49.090574 systemd-logind[1580]: Removed session 9. 
Dec 16 12:27:52.040681 kubelet[2858]: E1216 12:27:52.040607 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:27:54.255363 kernel: kauditd_printk_skb: 1 callbacks suppressed Dec 16 12:27:54.255533 kernel: audit: type=1130 audit(1765888074.249:755): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-128.140.49.38:22-139.178.89.65:34680 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:54.249000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-128.140.49.38:22-139.178.89.65:34680 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:54.250509 systemd[1]: Started sshd@9-128.140.49.38:22-139.178.89.65:34680.service - OpenSSH per-connection server daemon (139.178.89.65:34680). Dec 16 12:27:55.044627 kubelet[2858]: E1216 12:27:55.044551 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:27:55.047570 kubelet[2858]: E1216 12:27:55.047509 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:27:55.157000 audit[5095]: USER_ACCT pid=5095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.158995 sshd[5095]: Accepted publickey for core from 139.178.89.65 port 34680 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:27:55.160000 audit[5095]: CRED_ACQ pid=5095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.162998 sshd-session[5095]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:27:55.164913 kernel: audit: type=1101 audit(1765888075.157:756): pid=5095 uid=0 
auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.164966 kernel: audit: type=1103 audit(1765888075.160:757): pid=5095 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.168197 kernel: audit: type=1006 audit(1765888075.160:758): pid=5095 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Dec 16 12:27:55.160000 audit[5095]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe907b690 a2=3 a3=0 items=0 ppid=1 pid=5095 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:55.171785 kernel: audit: type=1300 audit(1765888075.160:758): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe907b690 a2=3 a3=0 items=0 ppid=1 pid=5095 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:55.160000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:55.176911 kernel: audit: type=1327 audit(1765888075.160:758): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:55.177417 systemd-logind[1580]: New session 10 of user core. Dec 16 12:27:55.183713 systemd[1]: Started session-10.scope - Session 10 of User core. 
Dec 16 12:27:55.187000 audit[5095]: USER_START pid=5095 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.187000 audit[5101]: CRED_ACQ pid=5101 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.195637 kernel: audit: type=1105 audit(1765888075.187:759): pid=5095 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.195912 kernel: audit: type=1103 audit(1765888075.187:760): pid=5101 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.765502 sshd[5101]: Connection closed by 139.178.89.65 port 34680 Dec 16 12:27:55.766518 sshd-session[5095]: pam_unix(sshd:session): session closed for user core Dec 16 12:27:55.768000 audit[5095]: USER_END pid=5095 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.772882 systemd-logind[1580]: Session 10 logged out. Waiting for processes to exit. Dec 16 12:27:55.768000 audit[5095]: CRED_DISP pid=5095 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.773945 systemd[1]: sshd@9-128.140.49.38:22-139.178.89.65:34680.service: Deactivated successfully. Dec 16 12:27:55.776462 kernel: audit: type=1106 audit(1765888075.768:761): pid=5095 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.776536 kernel: audit: type=1104 audit(1765888075.768:762): pid=5095 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:55.772000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-128.140.49.38:22-139.178.89.65:34680 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:55.778526 systemd[1]: session-10.scope: Deactivated successfully. Dec 16 12:27:55.787859 systemd-logind[1580]: Removed session 10. 
Dec 16 12:27:55.942000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-128.140.49.38:22-139.178.89.65:34682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:55.943800 systemd[1]: Started sshd@10-128.140.49.38:22-139.178.89.65:34682.service - OpenSSH per-connection server daemon (139.178.89.65:34682). Dec 16 12:27:56.824000 audit[5115]: USER_ACCT pid=5115 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:56.825554 sshd[5115]: Accepted publickey for core from 139.178.89.65 port 34682 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:27:56.825000 audit[5115]: CRED_ACQ pid=5115 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:56.825000 audit[5115]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffff7cba30 a2=3 a3=0 items=0 ppid=1 pid=5115 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:56.825000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:56.827832 sshd-session[5115]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:27:56.833747 systemd-logind[1580]: New session 11 of user core. Dec 16 12:27:56.840468 systemd[1]: Started session-11.scope - Session 11 of User core. Dec 16 12:27:56.844000 audit[5115]: USER_START pid=5115 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:56.846000 audit[5118]: CRED_ACQ pid=5118 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:57.495329 sshd[5118]: Connection closed by 139.178.89.65 port 34682 Dec 16 12:27:57.494541 sshd-session[5115]: pam_unix(sshd:session): session closed for user core Dec 16 12:27:57.496000 audit[5115]: USER_END pid=5115 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:57.496000 audit[5115]: CRED_DISP pid=5115 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:57.501613 systemd[1]: session-11.scope: Deactivated successfully. 
Dec 16 12:27:57.502946 systemd[1]: sshd@10-128.140.49.38:22-139.178.89.65:34682.service: Deactivated successfully. Dec 16 12:27:57.502000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-128.140.49.38:22-139.178.89.65:34682 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:57.507598 systemd-logind[1580]: Session 11 logged out. Waiting for processes to exit. Dec 16 12:27:57.511920 systemd-logind[1580]: Removed session 11. Dec 16 12:27:57.687107 systemd[1]: Started sshd@11-128.140.49.38:22-139.178.89.65:34694.service - OpenSSH per-connection server daemon (139.178.89.65:34694). Dec 16 12:27:57.685000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-128.140.49.38:22-139.178.89.65:34694 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:58.043080 kubelet[2858]: E1216 12:27:58.043026 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:27:58.043691 kubelet[2858]: E1216 12:27:58.043650 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:27:58.598000 audit[5128]: USER_ACCT pid=5128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:58.600050 sshd[5128]: Accepted publickey for core from 139.178.89.65 port 34694 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:27:58.600000 audit[5128]: CRED_ACQ pid=5128 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" 
exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:58.600000 audit[5128]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff6880bb0 a2=3 a3=0 items=0 ppid=1 pid=5128 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:27:58.600000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:27:58.602472 sshd-session[5128]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:27:58.609588 systemd-logind[1580]: New session 12 of user core. Dec 16 12:27:58.614493 systemd[1]: Started session-12.scope - Session 12 of User core. Dec 16 12:27:58.617000 audit[5128]: USER_START pid=5128 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:58.619000 audit[5131]: CRED_ACQ pid=5131 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:59.046843 kubelet[2858]: E1216 12:27:59.046789 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:27:59.216474 sshd[5131]: Connection closed by 139.178.89.65 port 34694 Dec 16 12:27:59.217217 sshd-session[5128]: pam_unix(sshd:session): session closed for user core Dec 16 12:27:59.218000 audit[5128]: USER_END pid=5128 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:59.218000 audit[5128]: CRED_DISP pid=5128 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:27:59.223112 systemd-logind[1580]: Session 12 logged out. Waiting for processes to exit. Dec 16 12:27:59.223289 systemd[1]: sshd@11-128.140.49.38:22-139.178.89.65:34694.service: Deactivated successfully. Dec 16 12:27:59.222000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-128.140.49.38:22-139.178.89.65:34694 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:27:59.227897 systemd[1]: session-12.scope: Deactivated successfully. Dec 16 12:27:59.234648 systemd-logind[1580]: Removed session 12. 
Dec 16 12:28:04.393716 systemd[1]: Started sshd@12-128.140.49.38:22-139.178.89.65:35586.service - OpenSSH per-connection server daemon (139.178.89.65:35586). Dec 16 12:28:04.392000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-128.140.49.38:22-139.178.89.65:35586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:04.396384 kernel: kauditd_printk_skb: 23 callbacks suppressed Dec 16 12:28:04.396507 kernel: audit: type=1130 audit(1765888084.392:782): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-128.140.49.38:22-139.178.89.65:35586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:05.270000 audit[5143]: USER_ACCT pid=5143 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.272624 sshd[5143]: Accepted publickey for core from 139.178.89.65 port 35586 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:28:05.278349 kernel: audit: type=1101 audit(1765888085.270:783): pid=5143 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.278412 kernel: audit: type=1103 audit(1765888085.276:784): pid=5143 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.276000 audit[5143]: CRED_ACQ pid=5143 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.278122 sshd-session[5143]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:28:05.282338 kernel: audit: type=1006 audit(1765888085.276:785): pid=5143 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=13 res=1 Dec 16 12:28:05.276000 audit[5143]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffffac74a0 a2=3 a3=0 items=0 ppid=1 pid=5143 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:05.285539 kernel: audit: type=1300 audit(1765888085.276:785): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffffac74a0 a2=3 a3=0 items=0 ppid=1 pid=5143 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:05.276000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:28:05.287369 kernel: audit: type=1327 audit(1765888085.276:785): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:28:05.291618 systemd-logind[1580]: New session 13 of user 
core. Dec 16 12:28:05.298690 systemd[1]: Started session-13.scope - Session 13 of User core. Dec 16 12:28:05.303000 audit[5143]: USER_START pid=5143 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.309327 kernel: audit: type=1105 audit(1765888085.303:786): pid=5143 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.307000 audit[5148]: CRED_ACQ pid=5148 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.312308 kernel: audit: type=1103 audit(1765888085.307:787): pid=5148 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.853993 sshd[5148]: Connection closed by 139.178.89.65 port 35586 Dec 16 12:28:05.854599 sshd-session[5143]: pam_unix(sshd:session): session closed for user core Dec 16 12:28:05.857000 audit[5143]: USER_END pid=5143 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.857000 audit[5143]: CRED_DISP pid=5143 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.863146 kernel: audit: type=1106 audit(1765888085.857:788): pid=5143 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.863247 kernel: audit: type=1104 audit(1765888085.857:789): pid=5143 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:05.864666 systemd[1]: sshd@12-128.140.49.38:22-139.178.89.65:35586.service: Deactivated successfully. Dec 16 12:28:05.863000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-128.140.49.38:22-139.178.89.65:35586 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:05.867718 systemd[1]: session-13.scope: Deactivated successfully. Dec 16 12:28:05.869158 systemd-logind[1580]: Session 13 logged out. Waiting for processes to exit. 
Dec 16 12:28:05.871467 systemd-logind[1580]: Removed session 13. Dec 16 12:28:06.029000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-128.140.49.38:22-139.178.89.65:35590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:06.030665 systemd[1]: Started sshd@13-128.140.49.38:22-139.178.89.65:35590.service - OpenSSH per-connection server daemon (139.178.89.65:35590). Dec 16 12:28:06.902000 audit[5160]: USER_ACCT pid=5160 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:06.904484 sshd[5160]: Accepted publickey for core from 139.178.89.65 port 35590 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:28:06.904000 audit[5160]: CRED_ACQ pid=5160 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:06.904000 audit[5160]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdddbd660 a2=3 a3=0 items=0 ppid=1 pid=5160 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:06.904000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:28:06.906146 sshd-session[5160]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:28:06.913914 systemd-logind[1580]: New session 14 of user core. Dec 16 12:28:06.918776 systemd[1]: Started session-14.scope - Session 14 of User core. 
Dec 16 12:28:06.922000 audit[5160]: USER_START pid=5160 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:06.924000 audit[5163]: CRED_ACQ pid=5163 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:07.047325 kubelet[2858]: E1216 12:28:07.046969 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef" Dec 16 12:28:07.048515 kubelet[2858]: E1216 12:28:07.048373 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad" Dec 16 12:28:07.641451 sshd[5163]: Connection closed by 139.178.89.65 port 35590 Dec 16 12:28:07.642556 sshd-session[5160]: pam_unix(sshd:session): session closed for user core Dec 16 12:28:07.643000 audit[5160]: USER_END pid=5160 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:07.644000 audit[5160]: CRED_DISP pid=5160 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:07.649651 systemd[1]: sshd@13-128.140.49.38:22-139.178.89.65:35590.service: Deactivated successfully. Dec 16 12:28:07.650000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-128.140.49.38:22-139.178.89.65:35590 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:07.656046 systemd[1]: session-14.scope: Deactivated successfully. Dec 16 12:28:07.663357 systemd-logind[1580]: Session 14 logged out. Waiting for processes to exit. Dec 16 12:28:07.664545 systemd-logind[1580]: Removed session 14. Dec 16 12:28:07.820000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-128.140.49.38:22-139.178.89.65:35600 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:28:07.821307 systemd[1]: Started sshd@14-128.140.49.38:22-139.178.89.65:35600.service - OpenSSH per-connection server daemon (139.178.89.65:35600). Dec 16 12:28:08.042900 kubelet[2858]: E1216 12:28:08.042855 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428" Dec 16 12:28:08.723000 audit[5173]: USER_ACCT pid=5173 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:08.724579 sshd[5173]: Accepted publickey for core from 139.178.89.65 port 35600 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:28:08.727841 sshd-session[5173]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:28:08.726000 audit[5173]: CRED_ACQ pid=5173 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:08.726000 audit[5173]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff77654c0 a2=3 a3=0 items=0 ppid=1 pid=5173 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:08.726000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:28:08.738196 systemd-logind[1580]: New session 15 of user core. Dec 16 12:28:08.743583 systemd[1]: Started session-15.scope - Session 15 of User core. 
Dec 16 12:28:08.747000 audit[5173]: USER_START pid=5173 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:08.749000 audit[5176]: CRED_ACQ pid=5176 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:09.047896 kubelet[2858]: E1216 12:28:09.047728 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276" Dec 16 12:28:10.043407 kubelet[2858]: E1216 12:28:10.042403 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258" Dec 16 12:28:10.043722 kubelet[2858]: E1216 12:28:10.043606 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8" Dec 16 12:28:10.075000 audit[5188]: NETFILTER_CFG table=filter:143 family=2 entries=26 op=nft_register_rule pid=5188 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:28:10.076646 kernel: kauditd_printk_skb: 20 callbacks suppressed Dec 16 12:28:10.076712 kernel: audit: type=1325 audit(1765888090.075:806): table=filter:143 family=2 entries=26 op=nft_register_rule pid=5188 subj=system_u:system_r:kernel_t:s0 
comm="iptables-restor" Dec 16 12:28:10.075000 audit[5188]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc8d7a3c0 a2=0 a3=1 items=0 ppid=2994 pid=5188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:10.075000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:28:10.085739 kernel: audit: type=1300 audit(1765888090.075:806): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffc8d7a3c0 a2=0 a3=1 items=0 ppid=2994 pid=5188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:10.085890 kernel: audit: type=1327 audit(1765888090.075:806): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:28:10.087000 audit[5188]: NETFILTER_CFG table=nat:144 family=2 entries=20 op=nft_register_rule pid=5188 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:28:10.090303 kernel: audit: type=1325 audit(1765888090.087:807): table=nat:144 family=2 entries=20 op=nft_register_rule pid=5188 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:28:10.087000 audit[5188]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc8d7a3c0 a2=0 a3=1 items=0 ppid=2994 pid=5188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:10.087000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:28:10.095409 kernel: audit: type=1300 audit(1765888090.087:807): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffc8d7a3c0 a2=0 a3=1 items=0 ppid=2994 pid=5188 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:10.095547 kernel: audit: type=1327 audit(1765888090.087:807): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:28:10.188165 sshd[5176]: Connection closed by 139.178.89.65 port 35600 Dec 16 12:28:10.189546 sshd-session[5173]: pam_unix(sshd:session): session closed for user core Dec 16 12:28:10.192000 audit[5173]: USER_END pid=5173 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:10.192000 audit[5173]: CRED_DISP pid=5173 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:10.198436 kernel: audit: type=1106 audit(1765888090.192:808): pid=5173 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close 
grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:10.198883 kernel: audit: type=1104 audit(1765888090.192:809): pid=5173 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:10.198853 systemd[1]: sshd@14-128.140.49.38:22-139.178.89.65:35600.service: Deactivated successfully. Dec 16 12:28:10.199000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-128.140.49.38:22-139.178.89.65:35600 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:10.201990 kernel: audit: type=1131 audit(1765888090.199:810): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-128.140.49.38:22-139.178.89.65:35600 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:10.205030 systemd[1]: session-15.scope: Deactivated successfully. Dec 16 12:28:10.208781 systemd-logind[1580]: Session 15 logged out. Waiting for processes to exit. Dec 16 12:28:10.209895 systemd-logind[1580]: Removed session 15. Dec 16 12:28:10.372000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-128.140.49.38:22-139.178.89.65:35602 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:10.372416 systemd[1]: Started sshd@15-128.140.49.38:22-139.178.89.65:35602.service - OpenSSH per-connection server daemon (139.178.89.65:35602). Dec 16 12:28:10.377307 kernel: audit: type=1130 audit(1765888090.372:811): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-128.140.49.38:22-139.178.89.65:35602 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Dec 16 12:28:11.112000 audit[5197]: NETFILTER_CFG table=filter:145 family=2 entries=38 op=nft_register_rule pid=5197 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:28:11.112000 audit[5197]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffd7447d30 a2=0 a3=1 items=0 ppid=2994 pid=5197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:11.112000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:28:11.116000 audit[5197]: NETFILTER_CFG table=nat:146 family=2 entries=20 op=nft_register_rule pid=5197 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Dec 16 12:28:11.116000 audit[5197]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffd7447d30 a2=0 a3=1 items=0 ppid=2994 pid=5197 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:11.116000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Dec 16 12:28:11.276000 audit[5193]: USER_ACCT pid=5193 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:11.277445 sshd[5193]: Accepted publickey for core from 139.178.89.65 port 35602 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:28:11.278000 audit[5193]: CRED_ACQ pid=5193 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:11.278000 audit[5193]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe4614930 a2=3 a3=0 items=0 ppid=1 pid=5193 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:11.278000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:28:11.279114 sshd-session[5193]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:28:11.287033 systemd-logind[1580]: New session 16 of user core. Dec 16 12:28:11.294561 systemd[1]: Started session-16.scope - Session 16 of User core. 
Dec 16 12:28:11.298000 audit[5193]: USER_START pid=5193 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:11.300000 audit[5198]: CRED_ACQ pid=5198 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:12.115607 sshd[5198]: Connection closed by 139.178.89.65 port 35602 Dec 16 12:28:12.118186 sshd-session[5193]: pam_unix(sshd:session): session closed for user core Dec 16 12:28:12.119000 audit[5193]: USER_END pid=5193 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:12.120000 audit[5193]: CRED_DISP pid=5193 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:12.124957 systemd[1]: sshd@15-128.140.49.38:22-139.178.89.65:35602.service: Deactivated successfully. Dec 16 12:28:12.125000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-128.140.49.38:22-139.178.89.65:35602 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:12.128649 systemd[1]: session-16.scope: Deactivated successfully. Dec 16 12:28:12.130494 systemd-logind[1580]: Session 16 logged out. Waiting for processes to exit. Dec 16 12:28:12.132932 systemd-logind[1580]: Removed session 16. Dec 16 12:28:12.296000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-128.140.49.38:22-139.178.89.65:58880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:12.296604 systemd[1]: Started sshd@16-128.140.49.38:22-139.178.89.65:58880.service - OpenSSH per-connection server daemon (139.178.89.65:58880). 
Dec 16 12:28:13.198000 audit[5212]: USER_ACCT pid=5212 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:13.200893 sshd[5212]: Accepted publickey for core from 139.178.89.65 port 58880 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI Dec 16 12:28:13.202000 audit[5212]: CRED_ACQ pid=5212 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:13.202000 audit[5212]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd4f86ca0 a2=3 a3=0 items=0 ppid=1 pid=5212 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Dec 16 12:28:13.202000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Dec 16 12:28:13.204514 sshd-session[5212]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Dec 16 12:28:13.213631 systemd-logind[1580]: New session 17 of user core. Dec 16 12:28:13.218580 systemd[1]: Started session-17.scope - Session 17 of User core. Dec 16 12:28:13.222000 audit[5212]: USER_START pid=5212 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:13.225000 audit[5215]: CRED_ACQ pid=5215 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:13.788741 sshd[5215]: Connection closed by 139.178.89.65 port 58880 Dec 16 12:28:13.788625 sshd-session[5212]: pam_unix(sshd:session): session closed for user core Dec 16 12:28:13.790000 audit[5212]: USER_END pid=5212 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:13.791000 audit[5212]: CRED_DISP pid=5212 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success' Dec 16 12:28:13.794980 systemd[1]: sshd@16-128.140.49.38:22-139.178.89.65:58880.service: Deactivated successfully. Dec 16 12:28:13.795000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-128.140.49.38:22-139.178.89.65:58880 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Dec 16 12:28:13.798936 systemd[1]: session-17.scope: Deactivated successfully. Dec 16 12:28:13.801343 systemd-logind[1580]: Session 17 logged out. Waiting for processes to exit. 
Dec 16 12:28:13.803466 systemd-logind[1580]: Removed session 17.
Dec 16 12:28:14.595000 audit[5227]: NETFILTER_CFG table=filter:147 family=2 entries=26 op=nft_register_rule pid=5227 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:28:14.595000 audit[5227]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffc83f9910 a2=0 a3=1 items=0 ppid=2994 pid=5227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:28:14.595000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:28:14.603000 audit[5227]: NETFILTER_CFG table=nat:148 family=2 entries=104 op=nft_register_chain pid=5227 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor"
Dec 16 12:28:14.603000 audit[5227]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffc83f9910 a2=0 a3=1 items=0 ppid=2994 pid=5227 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:28:14.603000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273
Dec 16 12:28:18.976980 systemd[1]: Started sshd@17-128.140.49.38:22-139.178.89.65:58886.service - OpenSSH per-connection server daemon (139.178.89.65:58886).
Dec 16 12:28:18.981423 kernel: kauditd_printk_skb: 33 callbacks suppressed
Dec 16 12:28:18.981510 kernel: audit: type=1130 audit(1765888098.976:833): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-128.140.49.38:22-139.178.89.65:58886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:28:18.976000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-128.140.49.38:22-139.178.89.65:58886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:28:19.043005 kubelet[2858]: E1216 12:28:19.042948 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-5kz8n" podUID="e5eca681-3514-4805-908a-df03e7d148ad"
Dec 16 12:28:19.893000 audit[5229]: USER_ACCT pid=5229 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.897155 sshd[5229]: Accepted publickey for core from 139.178.89.65 port 58886 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI
Dec 16 12:28:19.898300 kernel: audit: type=1101 audit(1765888099.893:834): pid=5229 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.897000 audit[5229]: CRED_ACQ pid=5229 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.898859 sshd-session[5229]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:28:19.903346 kernel: audit: type=1103 audit(1765888099.897:835): pid=5229 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.903456 kernel: audit: type=1006 audit(1765888099.897:836): pid=5229 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=18 res=1
Dec 16 12:28:19.897000 audit[5229]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcebc2ad0 a2=3 a3=0 items=0 ppid=1 pid=5229 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:28:19.906152 kernel: audit: type=1300 audit(1765888099.897:836): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcebc2ad0 a2=3 a3=0 items=0 ppid=1 pid=5229 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:28:19.897000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:28:19.907709 kernel: audit: type=1327 audit(1765888099.897:836): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:28:19.912135 systemd-logind[1580]: New session 18 of user core.
Dec 16 12:28:19.915577 systemd[1]: Started session-18.scope - Session 18 of User core.
Dec 16 12:28:19.921000 audit[5229]: USER_START pid=5229 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.930313 kernel: audit: type=1105 audit(1765888099.921:837): pid=5229 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.930000 audit[5232]: CRED_ACQ pid=5232 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:19.937296 kernel: audit: type=1103 audit(1765888099.930:838): pid=5232 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:20.043069 kubelet[2858]: E1216 12:28:20.042994 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef"
Dec 16 12:28:20.533092 sshd[5232]: Connection closed by 139.178.89.65 port 58886
Dec 16 12:28:20.533690 sshd-session[5229]: pam_unix(sshd:session): session closed for user core
Dec 16 12:28:20.533000 audit[5229]: USER_END pid=5229 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:20.544131 kernel: audit: type=1106 audit(1765888100.533:839): pid=5229 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:20.544333 kernel: audit: type=1104 audit(1765888100.534:840): pid=5229 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:20.534000 audit[5229]: CRED_DISP pid=5229 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:20.543015 systemd[1]: sshd@17-128.140.49.38:22-139.178.89.65:58886.service: Deactivated successfully.
Dec 16 12:28:20.543000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-128.140.49.38:22-139.178.89.65:58886 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:28:20.548158 systemd[1]: session-18.scope: Deactivated successfully.
Dec 16 12:28:20.550411 systemd-logind[1580]: Session 18 logged out. Waiting for processes to exit.
Dec 16 12:28:20.551594 systemd-logind[1580]: Removed session 18.
Dec 16 12:28:21.042465 kubelet[2858]: E1216 12:28:21.042088 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-5fd8987bcc-q5l6n" podUID="75b3ffde-bcfa-4d53-8c6e-f6a65e50e276"
Dec 16 12:28:22.042002 kubelet[2858]: E1216 12:28:22.041955 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-6b97fd59-4xf4h" podUID="6c115690-4c4b-4f0d-9563-71f7671c0428"
Dec 16 12:28:24.041534 kubelet[2858]: E1216 12:28:24.041410 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-ctclc" podUID="2053ff95-314d-4312-973b-a00f2cf38258"
Dec 16 12:28:25.044462 kubelet[2858]: E1216 12:28:25.044393 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-psf64" podUID="3de277ef-70c9-4b08-8b83-d92a9680c7b8"
Dec 16 12:28:25.717640 systemd[1]: Started sshd@18-128.140.49.38:22-139.178.89.65:46766.service - OpenSSH per-connection server daemon (139.178.89.65:46766).
Dec 16 12:28:25.720803 kernel: kauditd_printk_skb: 1 callbacks suppressed
Dec 16 12:28:25.720841 kernel: audit: type=1130 audit(1765888105.716:842): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-128.140.49.38:22-139.178.89.65:46766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:28:25.716000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-128.140.49.38:22-139.178.89.65:46766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:28:26.613000 audit[5267]: USER_ACCT pid=5267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.619544 sshd[5267]: Accepted publickey for core from 139.178.89.65 port 46766 ssh2: RSA SHA256:7GRNP2Xo+ztu9NygLqELCx+Z/yej5nUSnLKe9XvFTMI
Dec 16 12:28:26.618000 audit[5267]: CRED_ACQ pid=5267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.623096 kernel: audit: type=1101 audit(1765888106.613:843): pid=5267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_unix,pam_faillock,pam_permit acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.623190 kernel: audit: type=1103 audit(1765888106.618:844): pid=5267 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.620158 sshd-session[5267]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)
Dec 16 12:28:26.624865 kernel: audit: type=1006 audit(1765888106.618:845): pid=5267 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=19 res=1
Dec 16 12:28:26.618000 audit[5267]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd3cc0e0 a2=3 a3=0 items=0 ppid=1 pid=5267 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:28:26.618000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:28:26.630808 kernel: audit: type=1300 audit(1765888106.618:845): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffdd3cc0e0 a2=3 a3=0 items=0 ppid=1 pid=5267 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null)
Dec 16 12:28:26.630909 kernel: audit: type=1327 audit(1765888106.618:845): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D
Dec 16 12:28:26.634735 systemd-logind[1580]: New session 19 of user core.
Dec 16 12:28:26.639792 systemd[1]: Started session-19.scope - Session 19 of User core.
Dec 16 12:28:26.643000 audit[5267]: USER_START pid=5267 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.648000 audit[5270]: CRED_ACQ pid=5270 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.652956 kernel: audit: type=1105 audit(1765888106.643:846): pid=5267 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:26.653019 kernel: audit: type=1103 audit(1765888106.648:847): pid=5270 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:27.237753 sshd[5270]: Connection closed by 139.178.89.65 port 46766
Dec 16 12:28:27.239046 sshd-session[5267]: pam_unix(sshd:session): session closed for user core
Dec 16 12:28:27.239000 audit[5267]: USER_END pid=5267 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:27.248331 systemd-logind[1580]: Session 19 logged out. Waiting for processes to exit.
Dec 16 12:28:27.251061 systemd[1]: sshd@18-128.140.49.38:22-139.178.89.65:46766.service: Deactivated successfully.
Dec 16 12:28:27.253792 kernel: audit: type=1106 audit(1765888107.239:848): pid=5267 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_lastlog,pam_limits,pam_env,pam_unix,pam_permit,pam_systemd,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:27.253871 kernel: audit: type=1104 audit(1765888107.240:849): pid=5267 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:27.240000 audit[5267]: CRED_DISP pid=5267 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock,pam_unix acct="core" exe="/usr/lib64/misc/sshd-session" hostname=139.178.89.65 addr=139.178.89.65 terminal=ssh res=success'
Dec 16 12:28:27.256000 systemd[1]: session-19.scope: Deactivated successfully.
Dec 16 12:28:27.252000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-128.140.49.38:22-139.178.89.65:46766 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Dec 16 12:28:27.262344 systemd-logind[1580]: Removed session 19.
Dec 16 12:28:31.042339 kubelet[2858]: E1216 12:28:31.042286 2858 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-79c4b989f5-tzbch" podUID="91bf0287-40fc-4bcc-aa76-a7888b9a94ef"