Jan 16 17:56:49.415474 kernel: Booting Linux on physical CPU 0x0000000000 [0x413fd0c1] Jan 16 17:56:49.415497 kernel: Linux version 6.12.65-flatcar (build@pony-truck.infra.kinvolk.io) (aarch64-cros-linux-gnu-gcc (Gentoo Hardened 14.3.1_p20250801 p4) 14.3.1 20250801, GNU ld (Gentoo 2.45 p3) 2.45.0) #1 SMP PREEMPT Fri Jan 16 03:04:27 -00 2026 Jan 16 17:56:49.415508 kernel: KASLR enabled Jan 16 17:56:49.415514 kernel: efi: EFI v2.7 by Ubuntu distribution of EDK II Jan 16 17:56:49.415520 kernel: efi: SMBIOS 3.0=0x139ed0000 MEMATTR=0x1390b8118 ACPI 2.0=0x136760018 RNG=0x13676e918 MEMRESERVE=0x136b41218 Jan 16 17:56:49.415525 kernel: random: crng init done Jan 16 17:56:49.415533 kernel: secureboot: Secure boot disabled Jan 16 17:56:49.415539 kernel: ACPI: Early table checksum verification disabled Jan 16 17:56:49.415545 kernel: ACPI: RSDP 0x0000000136760018 000024 (v02 BOCHS ) Jan 16 17:56:49.416651 kernel: ACPI: XSDT 0x000000013676FE98 00006C (v01 BOCHS BXPC 00000001 01000013) Jan 16 17:56:49.416658 kernel: ACPI: FACP 0x000000013676FA98 000114 (v06 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416665 kernel: ACPI: DSDT 0x0000000136767518 001468 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416671 kernel: ACPI: APIC 0x000000013676FC18 000108 (v04 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416677 kernel: ACPI: PPTT 0x000000013676FD98 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416687 kernel: ACPI: GTDT 0x000000013676D898 000060 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416693 kernel: ACPI: MCFG 0x000000013676FF98 00003C (v01 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416700 kernel: ACPI: SPCR 0x000000013676E818 000050 (v02 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416707 kernel: ACPI: DBG2 0x000000013676E898 000057 (v00 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416713 kernel: ACPI: IORT 0x000000013676E418 000080 (v03 BOCHS BXPC 00000001 BXPC 00000001) Jan 16 17:56:49.416720 kernel: ACPI: BGRT 0x000000013676E798 000038 (v01 INTEL EDK2 00000002 01000013) Jan 16 17:56:49.416726 kernel: ACPI: SPCR: console: pl011,mmio32,0x9000000,9600 Jan 16 17:56:49.416733 kernel: ACPI: Use ACPI SPCR as default console: Yes Jan 16 17:56:49.416739 kernel: NUMA: Faking a node at [mem 0x0000000040000000-0x0000000139ffffff] Jan 16 17:56:49.416747 kernel: NODE_DATA(0) allocated [mem 0x13967da00-0x139684fff] Jan 16 17:56:49.416754 kernel: Zone ranges: Jan 16 17:56:49.416760 kernel: DMA [mem 0x0000000040000000-0x00000000ffffffff] Jan 16 17:56:49.416766 kernel: DMA32 empty Jan 16 17:56:49.416773 kernel: Normal [mem 0x0000000100000000-0x0000000139ffffff] Jan 16 17:56:49.416779 kernel: Device empty Jan 16 17:56:49.416786 kernel: Movable zone start for each node Jan 16 17:56:49.416792 kernel: Early memory node ranges Jan 16 17:56:49.416799 kernel: node 0: [mem 0x0000000040000000-0x000000013666ffff] Jan 16 17:56:49.416805 kernel: node 0: [mem 0x0000000136670000-0x000000013667ffff] Jan 16 17:56:49.416812 kernel: node 0: [mem 0x0000000136680000-0x000000013676ffff] Jan 16 17:56:49.416818 kernel: node 0: [mem 0x0000000136770000-0x0000000136b3ffff] Jan 16 17:56:49.416826 kernel: node 0: [mem 0x0000000136b40000-0x0000000139e1ffff] Jan 16 17:56:49.416832 kernel: node 0: [mem 0x0000000139e20000-0x0000000139eaffff] Jan 16 17:56:49.416838 kernel: node 0: [mem 0x0000000139eb0000-0x0000000139ebffff] Jan 16 17:56:49.416845 kernel: node 0: [mem 0x0000000139ec0000-0x0000000139fdffff] Jan 16 17:56:49.416852 kernel: node 0: [mem 
0x0000000139fe0000-0x0000000139ffffff] Jan 16 17:56:49.416861 kernel: Initmem setup node 0 [mem 0x0000000040000000-0x0000000139ffffff] Jan 16 17:56:49.416869 kernel: On node 0, zone Normal: 24576 pages in unavailable ranges Jan 16 17:56:49.416876 kernel: cma: Reserved 16 MiB at 0x00000000fca00000 on node -1 Jan 16 17:56:49.416883 kernel: psci: probing for conduit method from ACPI. Jan 16 17:56:49.416890 kernel: psci: PSCIv1.1 detected in firmware. Jan 16 17:56:49.416897 kernel: psci: Using standard PSCI v0.2 function IDs Jan 16 17:56:49.416904 kernel: psci: Trusted OS migration not required Jan 16 17:56:49.416910 kernel: psci: SMC Calling Convention v1.1 Jan 16 17:56:49.416917 kernel: smccc: KVM: hypervisor services detected (0x00000000 0x00000000 0x00000000 0x00000003) Jan 16 17:56:49.416926 kernel: percpu: Embedded 33 pages/cpu s98200 r8192 d28776 u135168 Jan 16 17:56:49.416947 kernel: pcpu-alloc: s98200 r8192 d28776 u135168 alloc=33*4096 Jan 16 17:56:49.416955 kernel: pcpu-alloc: [0] 0 [0] 1 Jan 16 17:56:49.416963 kernel: Detected PIPT I-cache on CPU0 Jan 16 17:56:49.416969 kernel: CPU features: detected: GIC system register CPU interface Jan 16 17:56:49.416976 kernel: CPU features: detected: Spectre-v4 Jan 16 17:56:49.416983 kernel: CPU features: detected: Spectre-BHB Jan 16 17:56:49.416990 kernel: CPU features: kernel page table isolation forced ON by KASLR Jan 16 17:56:49.416997 kernel: CPU features: detected: Kernel page table isolation (KPTI) Jan 16 17:56:49.417004 kernel: CPU features: detected: ARM erratum 1418040 Jan 16 17:56:49.417011 kernel: CPU features: detected: SSBS not fully self-synchronizing Jan 16 17:56:49.417020 kernel: alternatives: applying boot alternatives Jan 16 17:56:49.417029 kernel: Kernel command line: BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=924fb3eb04ba1d8edcb66284d30e3342855b0579b62556e7722bcf37e82bda13 Jan 16 17:56:49.417036 kernel: Dentry cache hash table entries: 524288 (order: 10, 4194304 bytes, linear) Jan 16 17:56:49.417043 kernel: Inode-cache hash table entries: 262144 (order: 9, 2097152 bytes, linear) Jan 16 17:56:49.417050 kernel: Fallback order for Node 0: 0 Jan 16 17:56:49.417057 kernel: Built 1 zonelists, mobility grouping on. Total pages: 1024000 Jan 16 17:56:49.417063 kernel: Policy zone: Normal Jan 16 17:56:49.417070 kernel: mem auto-init: stack:off, heap alloc:off, heap free:off Jan 16 17:56:49.417077 kernel: software IO TLB: area num 2. Jan 16 17:56:49.417084 kernel: software IO TLB: mapped [mem 0x00000000f8a00000-0x00000000fca00000] (64MB) Jan 16 17:56:49.417093 kernel: SLUB: HWalign=64, Order=0-3, MinObjects=0, CPUs=2, Nodes=1 Jan 16 17:56:49.417100 kernel: rcu: Preemptible hierarchical RCU implementation. Jan 16 17:56:49.417107 kernel: rcu: RCU event tracing is enabled. Jan 16 17:56:49.417114 kernel: rcu: RCU restricting CPUs from NR_CPUS=512 to nr_cpu_ids=2. Jan 16 17:56:49.417121 kernel: Trampoline variant of Tasks RCU enabled. Jan 16 17:56:49.417128 kernel: Tracing variant of Tasks RCU enabled. Jan 16 17:56:49.417135 kernel: rcu: RCU calculated value of scheduler-enlistment delay is 100 jiffies. Jan 16 17:56:49.417142 kernel: rcu: Adjusting geometry for rcu_fanout_leaf=16, nr_cpu_ids=2 Jan 16 17:56:49.417149 kernel: RCU Tasks: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. 
Jan 16 17:56:49.417156 kernel: RCU Tasks Trace: Setting shift to 1 and lim to 1 rcu_task_cb_adjust=1 rcu_task_cpu_ids=2. Jan 16 17:56:49.417163 kernel: NR_IRQS: 64, nr_irqs: 64, preallocated irqs: 0 Jan 16 17:56:49.417171 kernel: GICv3: 256 SPIs implemented Jan 16 17:56:49.417178 kernel: GICv3: 0 Extended SPIs implemented Jan 16 17:56:49.417185 kernel: Root IRQ handler: gic_handle_irq Jan 16 17:56:49.417192 kernel: GICv3: GICv3 features: 16 PPIs, DirectLPI Jan 16 17:56:49.417199 kernel: GICv3: GICD_CTRL.DS=1, SCR_EL3.FIQ=0 Jan 16 17:56:49.417206 kernel: GICv3: CPU0: found redistributor 0 region 0:0x00000000080a0000 Jan 16 17:56:49.417213 kernel: ITS [mem 0x08080000-0x0809ffff] Jan 16 17:56:49.417219 kernel: ITS@0x0000000008080000: allocated 8192 Devices @100100000 (indirect, esz 8, psz 64K, shr 1) Jan 16 17:56:49.417227 kernel: ITS@0x0000000008080000: allocated 8192 Interrupt Collections @100110000 (flat, esz 8, psz 64K, shr 1) Jan 16 17:56:49.417234 kernel: GICv3: using LPI property table @0x0000000100120000 Jan 16 17:56:49.417241 kernel: GICv3: CPU0: using allocated LPI pending table @0x0000000100130000 Jan 16 17:56:49.417249 kernel: rcu: srcu_init: Setting srcu_struct sizes based on contention. Jan 16 17:56:49.417256 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 16 17:56:49.417263 kernel: arch_timer: cp15 timer(s) running at 25.00MHz (virt). Jan 16 17:56:49.417270 kernel: clocksource: arch_sys_counter: mask: 0xffffffffffffff max_cycles: 0x5c40939b5, max_idle_ns: 440795202646 ns Jan 16 17:56:49.417277 kernel: sched_clock: 56 bits at 25MHz, resolution 40ns, wraps every 4398046511100ns Jan 16 17:56:49.417284 kernel: Console: colour dummy device 80x25 Jan 16 17:56:49.417292 kernel: ACPI: Core revision 20240827 Jan 16 17:56:49.417299 kernel: Calibrating delay loop (skipped), value calculated using timer frequency.. 50.00 BogoMIPS (lpj=25000) Jan 16 17:56:49.417307 kernel: pid_max: default: 32768 minimum: 301 Jan 16 17:56:49.417316 kernel: LSM: initializing lsm=lockdown,capability,landlock,selinux,ima Jan 16 17:56:49.417323 kernel: landlock: Up and running. Jan 16 17:56:49.417330 kernel: SELinux: Initializing. Jan 16 17:56:49.417338 kernel: Mount-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 16 17:56:49.417345 kernel: Mountpoint-cache hash table entries: 8192 (order: 4, 65536 bytes, linear) Jan 16 17:56:49.417353 kernel: rcu: Hierarchical SRCU implementation. Jan 16 17:56:49.417360 kernel: rcu: Max phase no-delay instances is 400. Jan 16 17:56:49.417368 kernel: Timer migration: 1 hierarchy levels; 8 children per group; 1 crossnode level Jan 16 17:56:49.417376 kernel: Remapping and enabling EFI services. Jan 16 17:56:49.417384 kernel: smp: Bringing up secondary CPUs ... Jan 16 17:56:49.417391 kernel: Detected PIPT I-cache on CPU1 Jan 16 17:56:49.417398 kernel: GICv3: CPU1: found redistributor 1 region 0:0x00000000080c0000 Jan 16 17:56:49.417406 kernel: GICv3: CPU1: using allocated LPI pending table @0x0000000100140000 Jan 16 17:56:49.417413 kernel: arch_timer: Enabling local workaround for ARM erratum 1418040 Jan 16 17:56:49.417420 kernel: CPU1: Booted secondary processor 0x0000000001 [0x413fd0c1] Jan 16 17:56:49.417429 kernel: smp: Brought up 1 node, 2 CPUs Jan 16 17:56:49.417437 kernel: SMP: Total of 2 processors activated. 
Jan 16 17:56:49.417449 kernel: CPU: All CPU(s) started at EL1 Jan 16 17:56:49.417458 kernel: CPU features: detected: 32-bit EL0 Support Jan 16 17:56:49.417466 kernel: CPU features: detected: Data cache clean to the PoU not required for I/D coherence Jan 16 17:56:49.417473 kernel: CPU features: detected: Common not Private translations Jan 16 17:56:49.417481 kernel: CPU features: detected: CRC32 instructions Jan 16 17:56:49.417489 kernel: CPU features: detected: Enhanced Virtualization Traps Jan 16 17:56:49.417498 kernel: CPU features: detected: RCpc load-acquire (LDAPR) Jan 16 17:56:49.417505 kernel: CPU features: detected: LSE atomic instructions Jan 16 17:56:49.417513 kernel: CPU features: detected: Privileged Access Never Jan 16 17:56:49.417521 kernel: CPU features: detected: RAS Extension Support Jan 16 17:56:49.417528 kernel: CPU features: detected: Speculative Store Bypassing Safe (SSBS) Jan 16 17:56:49.417538 kernel: alternatives: applying system-wide alternatives Jan 16 17:56:49.417555 kernel: CPU features: detected: Hardware dirty bit management on CPU0-1 Jan 16 17:56:49.417565 kernel: Memory: 3885860K/4096000K available (11200K kernel code, 2458K rwdata, 9092K rodata, 12480K init, 1038K bss, 188660K reserved, 16384K cma-reserved) Jan 16 17:56:49.417574 kernel: devtmpfs: initialized Jan 16 17:56:49.417581 kernel: clocksource: jiffies: mask: 0xffffffff max_cycles: 0xffffffff, max_idle_ns: 1911260446275000 ns Jan 16 17:56:49.417589 kernel: futex hash table entries: 512 (order: 3, 32768 bytes, linear) Jan 16 17:56:49.417597 kernel: 2G module region forced by RANDOMIZE_MODULE_REGION_FULL Jan 16 17:56:49.417605 kernel: 0 pages in range for non-PLT usage Jan 16 17:56:49.417614 kernel: 515152 pages in range for PLT usage Jan 16 17:56:49.417622 kernel: pinctrl core: initialized pinctrl subsystem Jan 16 17:56:49.417629 kernel: SMBIOS 3.0.0 present. Jan 16 17:56:49.417637 kernel: DMI: Hetzner vServer/KVM Virtual Machine, BIOS 20171111 11/11/2017 Jan 16 17:56:49.417645 kernel: DMI: Memory slots populated: 1/1 Jan 16 17:56:49.417652 kernel: NET: Registered PF_NETLINK/PF_ROUTE protocol family Jan 16 17:56:49.417660 kernel: DMA: preallocated 512 KiB GFP_KERNEL pool for atomic allocations Jan 16 17:56:49.417669 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA pool for atomic allocations Jan 16 17:56:49.417677 kernel: DMA: preallocated 512 KiB GFP_KERNEL|GFP_DMA32 pool for atomic allocations Jan 16 17:56:49.417685 kernel: audit: initializing netlink subsys (disabled) Jan 16 17:56:49.417692 kernel: audit: type=2000 audit(0.011:1): state=initialized audit_enabled=0 res=1 Jan 16 17:56:49.417700 kernel: thermal_sys: Registered thermal governor 'step_wise' Jan 16 17:56:49.417707 kernel: cpuidle: using governor menu Jan 16 17:56:49.417715 kernel: hw-breakpoint: found 6 breakpoint and 4 watchpoint registers. 
Jan 16 17:56:49.417724 kernel: ASID allocator initialised with 32768 entries Jan 16 17:56:49.417731 kernel: acpiphp: ACPI Hot Plug PCI Controller Driver version: 0.5 Jan 16 17:56:49.417739 kernel: Serial: AMBA PL011 UART driver Jan 16 17:56:49.417747 kernel: HugeTLB: registered 1.00 GiB page size, pre-allocated 0 pages Jan 16 17:56:49.417754 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 1.00 GiB page Jan 16 17:56:49.417762 kernel: HugeTLB: registered 32.0 MiB page size, pre-allocated 0 pages Jan 16 17:56:49.417769 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 32.0 MiB page Jan 16 17:56:49.417778 kernel: HugeTLB: registered 2.00 MiB page size, pre-allocated 0 pages Jan 16 17:56:49.417786 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 2.00 MiB page Jan 16 17:56:49.417794 kernel: HugeTLB: registered 64.0 KiB page size, pre-allocated 0 pages Jan 16 17:56:49.417801 kernel: HugeTLB: 0 KiB vmemmap can be freed for a 64.0 KiB page Jan 16 17:56:49.417809 kernel: ACPI: Added _OSI(Module Device) Jan 16 17:56:49.417816 kernel: ACPI: Added _OSI(Processor Device) Jan 16 17:56:49.417824 kernel: ACPI: Added _OSI(Processor Aggregator Device) Jan 16 17:56:49.417831 kernel: ACPI: 1 ACPI AML tables successfully acquired and loaded Jan 16 17:56:49.417840 kernel: ACPI: Interpreter enabled Jan 16 17:56:49.417848 kernel: ACPI: Using GIC for interrupt routing Jan 16 17:56:49.417855 kernel: ACPI: MCFG table detected, 1 entries Jan 16 17:56:49.417863 kernel: ACPI: CPU0 has been hot-added Jan 16 17:56:49.417870 kernel: ACPI: CPU1 has been hot-added Jan 16 17:56:49.417878 kernel: ARMH0011:00: ttyAMA0 at MMIO 0x9000000 (irq = 12, base_baud = 0) is a SBSA Jan 16 17:56:49.417886 kernel: printk: legacy console [ttyAMA0] enabled Jan 16 17:56:49.417895 kernel: ACPI: PCI Root Bridge [PCI0] (domain 0000 [bus 00-ff]) Jan 16 17:56:49.418101 kernel: acpi PNP0A08:00: _OSC: OS supports [ExtendedConfig ASPM ClockPM Segments MSI HPX-Type3] Jan 16 17:56:49.418194 kernel: acpi PNP0A08:00: _OSC: platform does not support [LTR] Jan 16 17:56:49.418274 kernel: acpi PNP0A08:00: _OSC: OS now controls [PCIeHotplug PME AER PCIeCapability] Jan 16 17:56:49.418353 kernel: acpi PNP0A08:00: ECAM area [mem 0x4010000000-0x401fffffff] reserved by PNP0C02:00 Jan 16 17:56:49.418434 kernel: acpi PNP0A08:00: ECAM at [mem 0x4010000000-0x401fffffff] for [bus 00-ff] Jan 16 17:56:49.418448 kernel: ACPI: Remapped I/O 0x000000003eff0000 to [io 0x0000-0xffff window] Jan 16 17:56:49.418456 kernel: PCI host bridge to bus 0000:00 Jan 16 17:56:49.418540 kernel: pci_bus 0000:00: root bus resource [mem 0x10000000-0x3efeffff window] Jan 16 17:56:49.420740 kernel: pci_bus 0000:00: root bus resource [io 0x0000-0xffff window] Jan 16 17:56:49.420818 kernel: pci_bus 0000:00: root bus resource [mem 0x8000000000-0xffffffffff window] Jan 16 17:56:49.420890 kernel: pci_bus 0000:00: root bus resource [bus 00-ff] Jan 16 17:56:49.421047 kernel: pci 0000:00:00.0: [1b36:0008] type 00 class 0x060000 conventional PCI endpoint Jan 16 17:56:49.421144 kernel: pci 0000:00:01.0: [1af4:1050] type 00 class 0x038000 conventional PCI endpoint Jan 16 17:56:49.421234 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11289000-0x11289fff] Jan 16 17:56:49.421315 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref] Jan 16 17:56:49.421403 kernel: pci 0000:00:02.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.421486 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11288000-0x11288fff] Jan 16 17:56:49.422051 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 16 
17:56:49.422157 kernel: pci 0000:00:02.0: bridge window [mem 0x11000000-0x111fffff] Jan 16 17:56:49.422239 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80000fffff 64bit pref] Jan 16 17:56:49.422329 kernel: pci 0000:00:02.1: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.422410 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11287000-0x11287fff] Jan 16 17:56:49.422496 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 16 17:56:49.423589 kernel: pci 0000:00:02.1: bridge window [mem 0x10e00000-0x10ffffff] Jan 16 17:56:49.423692 kernel: pci 0000:00:02.2: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.423774 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11286000-0x11286fff] Jan 16 17:56:49.423854 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 16 17:56:49.423979 kernel: pci 0000:00:02.2: bridge window [mem 0x10c00000-0x10dfffff] Jan 16 17:56:49.424067 kernel: pci 0000:00:02.2: bridge window [mem 0x8000100000-0x80001fffff 64bit pref] Jan 16 17:56:49.424159 kernel: pci 0000:00:02.3: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.424239 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11285000-0x11285fff] Jan 16 17:56:49.424318 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 16 17:56:49.424396 kernel: pci 0000:00:02.3: bridge window [mem 0x10a00000-0x10bfffff] Jan 16 17:56:49.424480 kernel: pci 0000:00:02.3: bridge window [mem 0x8000200000-0x80002fffff 64bit pref] Jan 16 17:56:49.424594 kernel: pci 0000:00:02.4: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.424679 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11284000-0x11284fff] Jan 16 17:56:49.424757 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 16 17:56:49.424835 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jan 16 17:56:49.424913 kernel: pci 0000:00:02.4: bridge window [mem 0x8000300000-0x80003fffff 64bit pref] Jan 16 17:56:49.425022 kernel: pci 0000:00:02.5: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.425105 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11283000-0x11283fff] Jan 16 17:56:49.425184 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 16 17:56:49.425263 kernel: pci 0000:00:02.5: bridge window [mem 0x10600000-0x107fffff] Jan 16 17:56:49.425341 kernel: pci 0000:00:02.5: bridge window [mem 0x8000400000-0x80004fffff 64bit pref] Jan 16 17:56:49.425425 kernel: pci 0000:00:02.6: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.425506 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11282000-0x11282fff] Jan 16 17:56:49.425692 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 16 17:56:49.425779 kernel: pci 0000:00:02.6: bridge window [mem 0x10400000-0x105fffff] Jan 16 17:56:49.426928 kernel: pci 0000:00:02.6: bridge window [mem 0x8000500000-0x80005fffff 64bit pref] Jan 16 17:56:49.427062 kernel: pci 0000:00:02.7: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.427145 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11281000-0x11281fff] Jan 16 17:56:49.427231 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 16 17:56:49.427309 kernel: pci 0000:00:02.7: bridge window [mem 0x10200000-0x103fffff] Jan 16 17:56:49.427395 kernel: pci 0000:00:03.0: [1b36:000c] type 01 class 0x060400 PCIe Root Port Jan 16 17:56:49.427475 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11280000-0x11280fff] Jan 16 17:56:49.427589 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 16 17:56:49.427679 kernel: pci 0000:00:03.0: bridge window [mem 0x10000000-0x101fffff] Jan 16 17:56:49.427767 kernel: pci 0000:00:04.0: [1b36:0002] type 00 class 
0x070002 conventional PCI endpoint Jan 16 17:56:49.427848 kernel: pci 0000:00:04.0: BAR 0 [io 0x0000-0x0007] Jan 16 17:56:49.427956 kernel: pci 0000:01:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 16 17:56:49.428046 kernel: pci 0000:01:00.0: BAR 1 [mem 0x11000000-0x11000fff] Jan 16 17:56:49.428131 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref] Jan 16 17:56:49.428213 kernel: pci 0000:01:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 16 17:56:49.428305 kernel: pci 0000:02:00.0: [1b36:000d] type 00 class 0x0c0330 PCIe Endpoint Jan 16 17:56:49.428387 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10e00000-0x10e03fff 64bit] Jan 16 17:56:49.428476 kernel: pci 0000:03:00.0: [1af4:1043] type 00 class 0x078000 PCIe Endpoint Jan 16 17:56:49.431224 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10c00000-0x10c00fff] Jan 16 17:56:49.431357 kernel: pci 0000:03:00.0: BAR 4 [mem 0x8000100000-0x8000103fff 64bit pref] Jan 16 17:56:49.431459 kernel: pci 0000:04:00.0: [1af4:1045] type 00 class 0x00ff00 PCIe Endpoint Jan 16 17:56:49.431563 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000200000-0x8000203fff 64bit pref] Jan 16 17:56:49.431673 kernel: pci 0000:05:00.0: [1af4:1044] type 00 class 0x00ff00 PCIe Endpoint Jan 16 17:56:49.431760 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff] Jan 16 17:56:49.431846 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000300000-0x8000303fff 64bit pref] Jan 16 17:56:49.431950 kernel: pci 0000:06:00.0: [1af4:1048] type 00 class 0x010000 PCIe Endpoint Jan 16 17:56:49.432038 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10600000-0x10600fff] Jan 16 17:56:49.432120 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000400000-0x8000403fff 64bit pref] Jan 16 17:56:49.432211 kernel: pci 0000:07:00.0: [1af4:1041] type 00 class 0x020000 PCIe Endpoint Jan 16 17:56:49.432294 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10400000-0x10400fff] Jan 16 17:56:49.432378 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000500000-0x8000503fff 64bit pref] Jan 16 17:56:49.432459 kernel: pci 0000:07:00.0: ROM [mem 0xfff80000-0xffffffff pref] Jan 16 17:56:49.432542 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x0fff] to [bus 01] add_size 1000 Jan 16 17:56:49.432636 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 01] add_size 100000 add_align 100000 Jan 16 17:56:49.432716 kernel: pci 0000:00:02.0: bridge window [mem 0x00100000-0x001fffff] to [bus 01] add_size 100000 add_align 100000 Jan 16 17:56:49.432799 kernel: pci 0000:00:02.1: bridge window [io 0x1000-0x0fff] to [bus 02] add_size 1000 Jan 16 17:56:49.432912 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 02] add_size 200000 add_align 100000 Jan 16 17:56:49.433016 kernel: pci 0000:00:02.1: bridge window [mem 0x00100000-0x001fffff] to [bus 02] add_size 100000 add_align 100000 Jan 16 17:56:49.433103 kernel: pci 0000:00:02.2: bridge window [io 0x1000-0x0fff] to [bus 03] add_size 1000 Jan 16 17:56:49.433185 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 03] add_size 100000 add_align 100000 Jan 16 17:56:49.433264 kernel: pci 0000:00:02.2: bridge window [mem 0x00100000-0x001fffff] to [bus 03] add_size 100000 add_align 100000 Jan 16 17:56:49.433351 kernel: pci 0000:00:02.3: bridge window [io 0x1000-0x0fff] to [bus 04] add_size 1000 Jan 16 17:56:49.433431 kernel: pci 0000:00:02.3: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 04] add_size 100000 add_align 100000 Jan 16 17:56:49.433509 kernel: pci 0000:00:02.3: bridge window [mem 
0x00100000-0x000fffff] to [bus 04] add_size 200000 add_align 100000 Jan 16 17:56:49.433625 kernel: pci 0000:00:02.4: bridge window [io 0x1000-0x0fff] to [bus 05] add_size 1000 Jan 16 17:56:49.433837 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 05] add_size 100000 add_align 100000 Jan 16 17:56:49.433925 kernel: pci 0000:00:02.4: bridge window [mem 0x00100000-0x001fffff] to [bus 05] add_size 100000 add_align 100000 Jan 16 17:56:49.435109 kernel: pci 0000:00:02.5: bridge window [io 0x1000-0x0fff] to [bus 06] add_size 1000 Jan 16 17:56:49.435207 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 06] add_size 100000 add_align 100000 Jan 16 17:56:49.435289 kernel: pci 0000:00:02.5: bridge window [mem 0x00100000-0x001fffff] to [bus 06] add_size 100000 add_align 100000 Jan 16 17:56:49.435372 kernel: pci 0000:00:02.6: bridge window [io 0x1000-0x0fff] to [bus 07] add_size 1000 Jan 16 17:56:49.435453 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff 64bit pref] to [bus 07] add_size 100000 add_align 100000 Jan 16 17:56:49.435533 kernel: pci 0000:00:02.6: bridge window [mem 0x00100000-0x001fffff] to [bus 07] add_size 100000 add_align 100000 Jan 16 17:56:49.436724 kernel: pci 0000:00:02.7: bridge window [io 0x1000-0x0fff] to [bus 08] add_size 1000 Jan 16 17:56:49.436815 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 08] add_size 200000 add_align 100000 Jan 16 17:56:49.436897 kernel: pci 0000:00:02.7: bridge window [mem 0x00100000-0x000fffff] to [bus 08] add_size 200000 add_align 100000 Jan 16 17:56:49.437041 kernel: pci 0000:00:03.0: bridge window [io 0x1000-0x0fff] to [bus 09] add_size 1000 Jan 16 17:56:49.437129 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff 64bit pref] to [bus 09] add_size 200000 add_align 100000 Jan 16 17:56:49.437217 kernel: pci 0000:00:03.0: bridge window [mem 0x00100000-0x000fffff] to [bus 09] add_size 200000 add_align 100000 Jan 16 17:56:49.437301 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff]: assigned Jan 16 17:56:49.437381 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref]: assigned Jan 16 17:56:49.437462 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff]: assigned Jan 16 17:56:49.437543 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref]: assigned Jan 16 17:56:49.437644 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff]: assigned Jan 16 17:56:49.437728 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref]: assigned Jan 16 17:56:49.437813 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff]: assigned Jan 16 17:56:49.437892 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref]: assigned Jan 16 17:56:49.437993 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff]: assigned Jan 16 17:56:49.438075 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref]: assigned Jan 16 17:56:49.438155 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff]: assigned Jan 16 17:56:49.438235 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref]: assigned Jan 16 17:56:49.438319 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff]: assigned Jan 16 17:56:49.438398 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref]: assigned Jan 16 
17:56:49.438479 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff]: assigned Jan 16 17:56:49.439660 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref]: assigned Jan 16 17:56:49.439795 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff]: assigned Jan 16 17:56:49.439877 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref]: assigned Jan 16 17:56:49.440027 kernel: pci 0000:00:01.0: BAR 4 [mem 0x8001200000-0x8001203fff 64bit pref]: assigned Jan 16 17:56:49.440116 kernel: pci 0000:00:01.0: BAR 1 [mem 0x11200000-0x11200fff]: assigned Jan 16 17:56:49.440197 kernel: pci 0000:00:02.0: BAR 0 [mem 0x11201000-0x11201fff]: assigned Jan 16 17:56:49.440277 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff]: assigned Jan 16 17:56:49.440358 kernel: pci 0000:00:02.1: BAR 0 [mem 0x11202000-0x11202fff]: assigned Jan 16 17:56:49.440441 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff]: assigned Jan 16 17:56:49.440523 kernel: pci 0000:00:02.2: BAR 0 [mem 0x11203000-0x11203fff]: assigned Jan 16 17:56:49.442386 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff]: assigned Jan 16 17:56:49.442490 kernel: pci 0000:00:02.3: BAR 0 [mem 0x11204000-0x11204fff]: assigned Jan 16 17:56:49.442642 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff]: assigned Jan 16 17:56:49.442740 kernel: pci 0000:00:02.4: BAR 0 [mem 0x11205000-0x11205fff]: assigned Jan 16 17:56:49.442821 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff]: assigned Jan 16 17:56:49.442904 kernel: pci 0000:00:02.5: BAR 0 [mem 0x11206000-0x11206fff]: assigned Jan 16 17:56:49.443041 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff]: assigned Jan 16 17:56:49.443130 kernel: pci 0000:00:02.6: BAR 0 [mem 0x11207000-0x11207fff]: assigned Jan 16 17:56:49.443211 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff]: assigned Jan 16 17:56:49.443295 kernel: pci 0000:00:02.7: BAR 0 [mem 0x11208000-0x11208fff]: assigned Jan 16 17:56:49.443377 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff]: assigned Jan 16 17:56:49.443458 kernel: pci 0000:00:03.0: BAR 0 [mem 0x11209000-0x11209fff]: assigned Jan 16 17:56:49.443537 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff]: assigned Jan 16 17:56:49.444746 kernel: pci 0000:00:04.0: BAR 0 [io 0xa000-0xa007]: assigned Jan 16 17:56:49.444849 kernel: pci 0000:01:00.0: ROM [mem 0x10000000-0x1007ffff pref]: assigned Jan 16 17:56:49.444949 kernel: pci 0000:01:00.0: BAR 4 [mem 0x8000000000-0x8000003fff 64bit pref]: assigned Jan 16 17:56:49.445051 kernel: pci 0000:01:00.0: BAR 1 [mem 0x10080000-0x10080fff]: assigned Jan 16 17:56:49.445136 kernel: pci 0000:00:02.0: PCI bridge to [bus 01] Jan 16 17:56:49.445216 kernel: pci 0000:00:02.0: bridge window [io 0x1000-0x1fff] Jan 16 17:56:49.445294 kernel: pci 0000:00:02.0: bridge window [mem 0x10000000-0x101fffff] Jan 16 17:56:49.445373 kernel: pci 0000:00:02.0: bridge window [mem 0x8000000000-0x80001fffff 64bit pref] Jan 16 17:56:49.445480 kernel: pci 0000:02:00.0: BAR 0 [mem 0x10200000-0x10203fff 64bit]: assigned Jan 16 17:56:49.445579 kernel: pci 0000:00:02.1: PCI bridge to [bus 02] Jan 16 17:56:49.446386 kernel: pci 0000:00:02.1: bridge window [io 0x2000-0x2fff] Jan 16 17:56:49.446476 kernel: pci 0000:00:02.1: bridge window [mem 0x10200000-0x103fffff] Jan 16 17:56:49.446583 kernel: pci 0000:00:02.1: bridge window [mem 0x8000200000-0x80003fffff 64bit pref] Jan 16 17:56:49.446682 kernel: pci 0000:03:00.0: BAR 4 [mem 
0x8000400000-0x8000403fff 64bit pref]: assigned Jan 16 17:56:49.446770 kernel: pci 0000:03:00.0: BAR 1 [mem 0x10400000-0x10400fff]: assigned Jan 16 17:56:49.446850 kernel: pci 0000:00:02.2: PCI bridge to [bus 03] Jan 16 17:56:49.446942 kernel: pci 0000:00:02.2: bridge window [io 0x3000-0x3fff] Jan 16 17:56:49.447032 kernel: pci 0000:00:02.2: bridge window [mem 0x10400000-0x105fffff] Jan 16 17:56:49.447112 kernel: pci 0000:00:02.2: bridge window [mem 0x8000400000-0x80005fffff 64bit pref] Jan 16 17:56:49.447213 kernel: pci 0000:04:00.0: BAR 4 [mem 0x8000600000-0x8000603fff 64bit pref]: assigned Jan 16 17:56:49.447298 kernel: pci 0000:00:02.3: PCI bridge to [bus 04] Jan 16 17:56:49.447376 kernel: pci 0000:00:02.3: bridge window [io 0x4000-0x4fff] Jan 16 17:56:49.447457 kernel: pci 0000:00:02.3: bridge window [mem 0x10600000-0x107fffff] Jan 16 17:56:49.447536 kernel: pci 0000:00:02.3: bridge window [mem 0x8000600000-0x80007fffff 64bit pref] Jan 16 17:56:49.448506 kernel: pci 0000:05:00.0: BAR 4 [mem 0x8000800000-0x8000803fff 64bit pref]: assigned Jan 16 17:56:49.448622 kernel: pci 0000:05:00.0: BAR 1 [mem 0x10800000-0x10800fff]: assigned Jan 16 17:56:49.448712 kernel: pci 0000:00:02.4: PCI bridge to [bus 05] Jan 16 17:56:49.448793 kernel: pci 0000:00:02.4: bridge window [io 0x5000-0x5fff] Jan 16 17:56:49.448873 kernel: pci 0000:00:02.4: bridge window [mem 0x10800000-0x109fffff] Jan 16 17:56:49.448970 kernel: pci 0000:00:02.4: bridge window [mem 0x8000800000-0x80009fffff 64bit pref] Jan 16 17:56:49.449063 kernel: pci 0000:06:00.0: BAR 4 [mem 0x8000a00000-0x8000a03fff 64bit pref]: assigned Jan 16 17:56:49.449146 kernel: pci 0000:06:00.0: BAR 1 [mem 0x10a00000-0x10a00fff]: assigned Jan 16 17:56:49.449229 kernel: pci 0000:00:02.5: PCI bridge to [bus 06] Jan 16 17:56:49.449321 kernel: pci 0000:00:02.5: bridge window [io 0x6000-0x6fff] Jan 16 17:56:49.449401 kernel: pci 0000:00:02.5: bridge window [mem 0x10a00000-0x10bfffff] Jan 16 17:56:49.449479 kernel: pci 0000:00:02.5: bridge window [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 16 17:56:49.451148 kernel: pci 0000:07:00.0: ROM [mem 0x10c00000-0x10c7ffff pref]: assigned Jan 16 17:56:49.451260 kernel: pci 0000:07:00.0: BAR 4 [mem 0x8000c00000-0x8000c03fff 64bit pref]: assigned Jan 16 17:56:49.451350 kernel: pci 0000:07:00.0: BAR 1 [mem 0x10c80000-0x10c80fff]: assigned Jan 16 17:56:49.451435 kernel: pci 0000:00:02.6: PCI bridge to [bus 07] Jan 16 17:56:49.451518 kernel: pci 0000:00:02.6: bridge window [io 0x7000-0x7fff] Jan 16 17:56:49.451761 kernel: pci 0000:00:02.6: bridge window [mem 0x10c00000-0x10dfffff] Jan 16 17:56:49.454243 kernel: pci 0000:00:02.6: bridge window [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 16 17:56:49.454332 kernel: pci 0000:00:02.7: PCI bridge to [bus 08] Jan 16 17:56:49.454414 kernel: pci 0000:00:02.7: bridge window [io 0x8000-0x8fff] Jan 16 17:56:49.454495 kernel: pci 0000:00:02.7: bridge window [mem 0x10e00000-0x10ffffff] Jan 16 17:56:49.454619 kernel: pci 0000:00:02.7: bridge window [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 16 17:56:49.454720 kernel: pci 0000:00:03.0: PCI bridge to [bus 09] Jan 16 17:56:49.454807 kernel: pci 0000:00:03.0: bridge window [io 0x9000-0x9fff] Jan 16 17:56:49.454895 kernel: pci 0000:00:03.0: bridge window [mem 0x11000000-0x111fffff] Jan 16 17:56:49.454997 kernel: pci 0000:00:03.0: bridge window [mem 0x8001000000-0x80011fffff 64bit pref] Jan 16 17:56:49.455082 kernel: pci_bus 0000:00: resource 4 [mem 0x10000000-0x3efeffff window] Jan 16 17:56:49.455156 kernel: pci_bus 0000:00: 
resource 5 [io 0x0000-0xffff window] Jan 16 17:56:49.455232 kernel: pci_bus 0000:00: resource 6 [mem 0x8000000000-0xffffffffff window] Jan 16 17:56:49.455316 kernel: pci_bus 0000:01: resource 0 [io 0x1000-0x1fff] Jan 16 17:56:49.455394 kernel: pci_bus 0000:01: resource 1 [mem 0x10000000-0x101fffff] Jan 16 17:56:49.455479 kernel: pci_bus 0000:01: resource 2 [mem 0x8000000000-0x80001fffff 64bit pref] Jan 16 17:56:49.455828 kernel: pci_bus 0000:02: resource 0 [io 0x2000-0x2fff] Jan 16 17:56:49.455947 kernel: pci_bus 0000:02: resource 1 [mem 0x10200000-0x103fffff] Jan 16 17:56:49.456035 kernel: pci_bus 0000:02: resource 2 [mem 0x8000200000-0x80003fffff 64bit pref] Jan 16 17:56:49.456119 kernel: pci_bus 0000:03: resource 0 [io 0x3000-0x3fff] Jan 16 17:56:49.456194 kernel: pci_bus 0000:03: resource 1 [mem 0x10400000-0x105fffff] Jan 16 17:56:49.456658 kernel: pci_bus 0000:03: resource 2 [mem 0x8000400000-0x80005fffff 64bit pref] Jan 16 17:56:49.456782 kernel: pci_bus 0000:04: resource 0 [io 0x4000-0x4fff] Jan 16 17:56:49.456859 kernel: pci_bus 0000:04: resource 1 [mem 0x10600000-0x107fffff] Jan 16 17:56:49.456977 kernel: pci_bus 0000:04: resource 2 [mem 0x8000600000-0x80007fffff 64bit pref] Jan 16 17:56:49.457077 kernel: pci_bus 0000:05: resource 0 [io 0x5000-0x5fff] Jan 16 17:56:49.457154 kernel: pci_bus 0000:05: resource 1 [mem 0x10800000-0x109fffff] Jan 16 17:56:49.457228 kernel: pci_bus 0000:05: resource 2 [mem 0x8000800000-0x80009fffff 64bit pref] Jan 16 17:56:49.457315 kernel: pci_bus 0000:06: resource 0 [io 0x6000-0x6fff] Jan 16 17:56:49.457389 kernel: pci_bus 0000:06: resource 1 [mem 0x10a00000-0x10bfffff] Jan 16 17:56:49.457464 kernel: pci_bus 0000:06: resource 2 [mem 0x8000a00000-0x8000bfffff 64bit pref] Jan 16 17:56:49.457594 kernel: pci_bus 0000:07: resource 0 [io 0x7000-0x7fff] Jan 16 17:56:49.457678 kernel: pci_bus 0000:07: resource 1 [mem 0x10c00000-0x10dfffff] Jan 16 17:56:49.457760 kernel: pci_bus 0000:07: resource 2 [mem 0x8000c00000-0x8000dfffff 64bit pref] Jan 16 17:56:49.457846 kernel: pci_bus 0000:08: resource 0 [io 0x8000-0x8fff] Jan 16 17:56:49.457920 kernel: pci_bus 0000:08: resource 1 [mem 0x10e00000-0x10ffffff] Jan 16 17:56:49.458015 kernel: pci_bus 0000:08: resource 2 [mem 0x8000e00000-0x8000ffffff 64bit pref] Jan 16 17:56:49.458099 kernel: pci_bus 0000:09: resource 0 [io 0x9000-0x9fff] Jan 16 17:56:49.458173 kernel: pci_bus 0000:09: resource 1 [mem 0x11000000-0x111fffff] Jan 16 17:56:49.458251 kernel: pci_bus 0000:09: resource 2 [mem 0x8001000000-0x80011fffff 64bit pref] Jan 16 17:56:49.458262 kernel: ACPI: PCI: Interrupt link GSI0 configured for IRQ 35 Jan 16 17:56:49.458271 kernel: ACPI: PCI: Interrupt link GSI1 configured for IRQ 36 Jan 16 17:56:49.458280 kernel: ACPI: PCI: Interrupt link GSI2 configured for IRQ 37 Jan 16 17:56:49.458288 kernel: ACPI: PCI: Interrupt link GSI3 configured for IRQ 38 Jan 16 17:56:49.458296 kernel: iommu: Default domain type: Translated Jan 16 17:56:49.458305 kernel: iommu: DMA domain TLB invalidation policy: strict mode Jan 16 17:56:49.458315 kernel: efivars: Registered efivars operations Jan 16 17:56:49.458323 kernel: vgaarb: loaded Jan 16 17:56:49.458331 kernel: clocksource: Switched to clocksource arch_sys_counter Jan 16 17:56:49.458340 kernel: VFS: Disk quotas dquot_6.6.0 Jan 16 17:56:49.458348 kernel: VFS: Dquot-cache hash table entries: 512 (order 0, 4096 bytes) Jan 16 17:56:49.458357 kernel: pnp: PnP ACPI init Jan 16 17:56:49.458454 kernel: system 00:00: [mem 0x4010000000-0x401fffffff window] could not be reserved Jan 16 
17:56:49.458468 kernel: pnp: PnP ACPI: found 1 devices Jan 16 17:56:49.458476 kernel: NET: Registered PF_INET protocol family Jan 16 17:56:49.458484 kernel: IP idents hash table entries: 65536 (order: 7, 524288 bytes, linear) Jan 16 17:56:49.458493 kernel: tcp_listen_portaddr_hash hash table entries: 2048 (order: 3, 32768 bytes, linear) Jan 16 17:56:49.458501 kernel: Table-perturb hash table entries: 65536 (order: 6, 262144 bytes, linear) Jan 16 17:56:49.458509 kernel: TCP established hash table entries: 32768 (order: 6, 262144 bytes, linear) Jan 16 17:56:49.458517 kernel: TCP bind hash table entries: 32768 (order: 8, 1048576 bytes, linear) Jan 16 17:56:49.458527 kernel: TCP: Hash tables configured (established 32768 bind 32768) Jan 16 17:56:49.458535 kernel: UDP hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 16 17:56:49.458543 kernel: UDP-Lite hash table entries: 2048 (order: 4, 65536 bytes, linear) Jan 16 17:56:49.458587 kernel: NET: Registered PF_UNIX/PF_LOCAL protocol family Jan 16 17:56:49.458688 kernel: pci 0000:02:00.0: enabling device (0000 -> 0002) Jan 16 17:56:49.458700 kernel: PCI: CLS 0 bytes, default 64 Jan 16 17:56:49.458712 kernel: kvm [1]: HYP mode not available Jan 16 17:56:49.458721 kernel: Initialise system trusted keyrings Jan 16 17:56:49.458729 kernel: workingset: timestamp_bits=39 max_order=20 bucket_order=0 Jan 16 17:56:49.458738 kernel: Key type asymmetric registered Jan 16 17:56:49.458746 kernel: Asymmetric key parser 'x509' registered Jan 16 17:56:49.458754 kernel: Block layer SCSI generic (bsg) driver version 0.4 loaded (major 249) Jan 16 17:56:49.458762 kernel: io scheduler mq-deadline registered Jan 16 17:56:49.458770 kernel: io scheduler kyber registered Jan 16 17:56:49.458780 kernel: io scheduler bfq registered Jan 16 17:56:49.458788 kernel: ACPI: \_SB_.PCI0.GSI2: Enabled at IRQ 37 Jan 16 17:56:49.460704 kernel: pcieport 0000:00:02.0: PME: Signaling with IRQ 50 Jan 16 17:56:49.460805 kernel: pcieport 0000:00:02.0: AER: enabled with IRQ 50 Jan 16 17:56:49.460888 kernel: pcieport 0000:00:02.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.460992 kernel: pcieport 0000:00:02.1: PME: Signaling with IRQ 51 Jan 16 17:56:49.461091 kernel: pcieport 0000:00:02.1: AER: enabled with IRQ 51 Jan 16 17:56:49.461179 kernel: pcieport 0000:00:02.1: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.461264 kernel: pcieport 0000:00:02.2: PME: Signaling with IRQ 52 Jan 16 17:56:49.461345 kernel: pcieport 0000:00:02.2: AER: enabled with IRQ 52 Jan 16 17:56:49.461424 kernel: pcieport 0000:00:02.2: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.461512 kernel: pcieport 0000:00:02.3: PME: Signaling with IRQ 53 Jan 16 17:56:49.463180 kernel: pcieport 0000:00:02.3: AER: enabled with IRQ 53 Jan 16 17:56:49.463289 kernel: pcieport 0000:00:02.3: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.463376 kernel: pcieport 0000:00:02.4: PME: Signaling with IRQ 54 Jan 16 17:56:49.463458 kernel: pcieport 0000:00:02.4: AER: enabled with IRQ 54 Jan 16 17:56:49.463540 kernel: pcieport 0000:00:02.4: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.465209 kernel: pcieport 
0000:00:02.5: PME: Signaling with IRQ 55 Jan 16 17:56:49.465306 kernel: pcieport 0000:00:02.5: AER: enabled with IRQ 55 Jan 16 17:56:49.465394 kernel: pcieport 0000:00:02.5: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.465478 kernel: pcieport 0000:00:02.6: PME: Signaling with IRQ 56 Jan 16 17:56:49.465573 kernel: pcieport 0000:00:02.6: AER: enabled with IRQ 56 Jan 16 17:56:49.468194 kernel: pcieport 0000:00:02.6: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.468293 kernel: pcieport 0000:00:02.7: PME: Signaling with IRQ 57 Jan 16 17:56:49.468379 kernel: pcieport 0000:00:02.7: AER: enabled with IRQ 57 Jan 16 17:56:49.468462 kernel: pcieport 0000:00:02.7: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.468479 kernel: ACPI: \_SB_.PCI0.GSI3: Enabled at IRQ 38 Jan 16 17:56:49.469146 kernel: pcieport 0000:00:03.0: PME: Signaling with IRQ 58 Jan 16 17:56:49.469256 kernel: pcieport 0000:00:03.0: AER: enabled with IRQ 58 Jan 16 17:56:49.469341 kernel: pcieport 0000:00:03.0: pciehp: Slot #0 AttnBtn+ PwrCtrl+ MRL- AttnInd+ PwrInd+ HotPlug+ Surprise+ Interlock+ NoCompl- IbPresDis- LLActRep+ Jan 16 17:56:49.469352 kernel: input: Power Button as /devices/LNXSYSTM:00/LNXSYBUS:00/PNP0C0C:00/input/input0 Jan 16 17:56:49.469361 kernel: ACPI: button: Power Button [PWRB] Jan 16 17:56:49.469375 kernel: ACPI: \_SB_.PCI0.GSI1: Enabled at IRQ 36 Jan 16 17:56:49.469463 kernel: virtio-pci 0000:04:00.0: enabling device (0000 -> 0002) Jan 16 17:56:49.469579 kernel: virtio-pci 0000:07:00.0: enabling device (0000 -> 0002) Jan 16 17:56:49.469593 kernel: Serial: 8250/16550 driver, 4 ports, IRQ sharing enabled Jan 16 17:56:49.469602 kernel: ACPI: \_SB_.PCI0.GSI0: Enabled at IRQ 35 Jan 16 17:56:49.469692 kernel: serial 0000:00:04.0: enabling device (0000 -> 0001) Jan 16 17:56:49.469704 kernel: 0000:00:04.0: ttyS0 at I/O 0xa000 (irq = 45, base_baud = 115200) is a 16550A Jan 16 17:56:49.469715 kernel: thunder_xcv, ver 1.0 Jan 16 17:56:49.469724 kernel: thunder_bgx, ver 1.0 Jan 16 17:56:49.469732 kernel: nicpf, ver 1.0 Jan 16 17:56:49.469740 kernel: nicvf, ver 1.0 Jan 16 17:56:49.469856 kernel: rtc-efi rtc-efi.0: registered as rtc0 Jan 16 17:56:49.469955 kernel: rtc-efi rtc-efi.0: setting system clock to 2026-01-16T17:56:48 UTC (1768586208) Jan 16 17:56:49.469970 kernel: hid: raw HID events driver (C) Jiri Kosina Jan 16 17:56:49.469979 kernel: hw perfevents: enabled with armv8_pmuv3_0 PMU driver, 7 (0,8000003f) counters available Jan 16 17:56:49.469987 kernel: watchdog: NMI not fully supported Jan 16 17:56:49.469995 kernel: watchdog: Hard watchdog permanently disabled Jan 16 17:56:49.470003 kernel: NET: Registered PF_INET6 protocol family Jan 16 17:56:49.470011 kernel: Segment Routing with IPv6 Jan 16 17:56:49.470019 kernel: In-situ OAM (IOAM) with IPv6 Jan 16 17:56:49.470028 kernel: NET: Registered PF_PACKET protocol family Jan 16 17:56:49.470037 kernel: Key type dns_resolver registered Jan 16 17:56:49.470046 kernel: registered taskstats version 1 Jan 16 17:56:49.470054 kernel: Loading compiled-in X.509 certificates Jan 16 17:56:49.470062 kernel: Loaded X.509 cert 'Kinvolk GmbH: Module signing key for 6.12.65-flatcar: 27e3aa638f3535434dc9dbdde4239fca944d5458' Jan 16 17:56:49.470070 kernel: Demotion targets for Node 0: null Jan 16 17:56:49.470078 kernel: Key type .fscrypt 
registered Jan 16 17:56:49.470087 kernel: Key type fscrypt-provisioning registered Jan 16 17:56:49.470227 kernel: ima: No TPM chip found, activating TPM-bypass! Jan 16 17:56:49.470240 kernel: ima: Allocated hash algorithm: sha1 Jan 16 17:56:49.470248 kernel: ima: No architecture policies found Jan 16 17:56:49.470257 kernel: alg: No test for fips(ansi_cprng) (fips_ansi_cprng) Jan 16 17:56:49.470265 kernel: clk: Disabling unused clocks Jan 16 17:56:49.470273 kernel: PM: genpd: Disabling unused power domains Jan 16 17:56:49.470281 kernel: Freeing unused kernel memory: 12480K Jan 16 17:56:49.470293 kernel: Run /init as init process Jan 16 17:56:49.470301 kernel: with arguments: Jan 16 17:56:49.470309 kernel: /init Jan 16 17:56:49.470317 kernel: with environment: Jan 16 17:56:49.470324 kernel: HOME=/ Jan 16 17:56:49.470333 kernel: TERM=linux Jan 16 17:56:49.470341 kernel: ACPI: bus type USB registered Jan 16 17:56:49.470350 kernel: usbcore: registered new interface driver usbfs Jan 16 17:56:49.470359 kernel: usbcore: registered new interface driver hub Jan 16 17:56:49.470367 kernel: usbcore: registered new device driver usb Jan 16 17:56:49.470480 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 16 17:56:49.470601 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 1 Jan 16 17:56:49.471168 kernel: xhci_hcd 0000:02:00.0: hcc params 0x00087001 hci version 0x100 quirks 0x0000000000000010 Jan 16 17:56:49.471260 kernel: xhci_hcd 0000:02:00.0: xHCI Host Controller Jan 16 17:56:49.471349 kernel: xhci_hcd 0000:02:00.0: new USB bus registered, assigned bus number 2 Jan 16 17:56:49.471431 kernel: xhci_hcd 0000:02:00.0: Host supports USB 3.0 SuperSpeed Jan 16 17:56:49.474530 kernel: hub 1-0:1.0: USB hub found Jan 16 17:56:49.474682 kernel: hub 1-0:1.0: 4 ports detected Jan 16 17:56:49.474787 kernel: usb usb2: We don't know the algorithms for LPM for this host, disabling LPM. Jan 16 17:56:49.474892 kernel: hub 2-0:1.0: USB hub found Jan 16 17:56:49.475004 kernel: hub 2-0:1.0: 4 ports detected Jan 16 17:56:49.475016 kernel: SCSI subsystem initialized Jan 16 17:56:49.475118 kernel: virtio_scsi virtio5: 2/0/0 default/read/poll queues Jan 16 17:56:49.475215 kernel: scsi host0: Virtio SCSI HBA Jan 16 17:56:49.475319 kernel: scsi 0:0:0:0: CD-ROM QEMU QEMU CD-ROM 2.5+ PQ: 0 ANSI: 5 Jan 16 17:56:49.475419 kernel: scsi 0:0:0:1: Direct-Access QEMU QEMU HARDDISK 2.5+ PQ: 0 ANSI: 5 Jan 16 17:56:49.475507 kernel: sd 0:0:0:1: Power-on or device reset occurred Jan 16 17:56:49.475611 kernel: sd 0:0:0:1: [sda] 80003072 512-byte logical blocks: (41.0 GB/38.1 GiB) Jan 16 17:56:49.475702 kernel: sd 0:0:0:1: [sda] Write Protect is off Jan 16 17:56:49.475789 kernel: sd 0:0:0:1: [sda] Mode Sense: 63 00 00 08 Jan 16 17:56:49.475880 kernel: sd 0:0:0:1: [sda] Write cache: enabled, read cache: enabled, doesn't support DPO or FUA Jan 16 17:56:49.475891 kernel: GPT:Primary header thinks Alt. header is not at the end of the disk. Jan 16 17:56:49.475899 kernel: GPT:25804799 != 80003071 Jan 16 17:56:49.475907 kernel: GPT:Alternate GPT header not at the end of the disk. Jan 16 17:56:49.475915 kernel: GPT:25804799 != 80003071 Jan 16 17:56:49.475923 kernel: GPT: Use GNU Parted to correct GPT errors. 
Jan 16 17:56:49.475969 kernel: sda: sda1 sda2 sda3 sda4 sda6 sda7 sda9 Jan 16 17:56:49.476072 kernel: sd 0:0:0:1: [sda] Attached SCSI disk Jan 16 17:56:49.476163 kernel: sr 0:0:0:0: Power-on or device reset occurred Jan 16 17:56:49.476251 kernel: sr 0:0:0:0: [sr0] scsi3-mmc drive: 16x/50x cd/rw xa/form2 cdda tray Jan 16 17:56:49.476261 kernel: cdrom: Uniform CD-ROM driver Revision: 3.20 Jan 16 17:56:49.476347 kernel: sr 0:0:0:0: Attached scsi CD-ROM sr0 Jan 16 17:56:49.476357 kernel: device-mapper: core: CONFIG_IMA_DISABLE_HTABLE is disabled. Duplicate IMA measurements will not be recorded in the IMA log. Jan 16 17:56:49.476367 kernel: device-mapper: uevent: version 1.0.3 Jan 16 17:56:49.476376 kernel: device-mapper: ioctl: 4.48.0-ioctl (2023-03-01) initialised: dm-devel@lists.linux.dev Jan 16 17:56:49.476384 kernel: device-mapper: verity: sha256 using shash "sha256-ce" Jan 16 17:56:49.476392 kernel: raid6: neonx8 gen() 15385 MB/s Jan 16 17:56:49.476400 kernel: raid6: neonx4 gen() 15645 MB/s Jan 16 17:56:49.476408 kernel: raid6: neonx2 gen() 13182 MB/s Jan 16 17:56:49.476417 kernel: raid6: neonx1 gen() 10403 MB/s Jan 16 17:56:49.476426 kernel: raid6: int64x8 gen() 6814 MB/s Jan 16 17:56:49.476434 kernel: raid6: int64x4 gen() 7316 MB/s Jan 16 17:56:49.476536 kernel: usb 1-1: new high-speed USB device number 2 using xhci_hcd Jan 16 17:56:49.476579 kernel: raid6: int64x2 gen() 6014 MB/s Jan 16 17:56:49.476588 kernel: raid6: int64x1 gen() 5036 MB/s Jan 16 17:56:49.476596 kernel: raid6: using algorithm neonx4 gen() 15645 MB/s Jan 16 17:56:49.476604 kernel: raid6: .... xor() 12274 MB/s, rmw enabled Jan 16 17:56:49.476616 kernel: raid6: using neon recovery algorithm Jan 16 17:56:49.476624 kernel: xor: measuring software checksum speed Jan 16 17:56:49.476632 kernel: 8regs : 20358 MB/sec Jan 16 17:56:49.476640 kernel: 32regs : 18578 MB/sec Jan 16 17:56:49.476648 kernel: arm64_neon : 28176 MB/sec Jan 16 17:56:49.476656 kernel: xor: using function: arm64_neon (28176 MB/sec) Jan 16 17:56:49.476665 kernel: Btrfs loaded, zoned=no, fsverity=no Jan 16 17:56:49.476675 kernel: BTRFS: device fsid 772c9e2d-7e98-4acf-842c-b5416fff0f38 devid 1 transid 34 /dev/mapper/usr (254:0) scanned by mount (211) Jan 16 17:56:49.476684 kernel: BTRFS info (device dm-0): first mount of filesystem 772c9e2d-7e98-4acf-842c-b5416fff0f38 Jan 16 17:56:49.476692 kernel: BTRFS info (device dm-0): using crc32c (crc32c-generic) checksum algorithm Jan 16 17:56:49.476701 kernel: BTRFS info (device dm-0): enabling ssd optimizations Jan 16 17:56:49.476709 kernel: BTRFS info (device dm-0): disabling log replay at mount time Jan 16 17:56:49.476717 kernel: BTRFS info (device dm-0): enabling free space tree Jan 16 17:56:49.476725 kernel: loop: module loaded Jan 16 17:56:49.476735 kernel: loop0: detected capacity change from 0 to 91832 Jan 16 17:56:49.476743 kernel: squashfs: version 4.0 (2009/01/31) Phillip Lougher Jan 16 17:56:49.476854 kernel: usb 1-2: new high-speed USB device number 3 using xhci_hcd Jan 16 17:56:49.476881 systemd[1]: Successfully made /usr/ read-only. Jan 16 17:56:49.476893 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 16 17:56:49.476905 systemd[1]: Detected virtualization kvm. 
Jan 16 17:56:49.476913 systemd[1]: Detected architecture arm64. Jan 16 17:56:49.476922 systemd[1]: Running in initrd. Jan 16 17:56:49.476941 systemd[1]: No hostname configured, using default hostname. Jan 16 17:56:49.476953 systemd[1]: Hostname set to . Jan 16 17:56:49.476961 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 16 17:56:49.476970 systemd[1]: Queued start job for default target initrd.target. Jan 16 17:56:49.476981 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 16 17:56:49.476989 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 16 17:56:49.476998 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 16 17:56:49.477007 systemd[1]: Expecting device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - /dev/disk/by-label/EFI-SYSTEM... Jan 16 17:56:49.477015 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 16 17:56:49.477025 systemd[1]: Expecting device dev-disk-by\x2dlabel-ROOT.device - /dev/disk/by-label/ROOT... Jan 16 17:56:49.477035 systemd[1]: Expecting device dev-disk-by\x2dpartlabel-USR\x2dA.device - /dev/disk/by-partlabel/USR-A... Jan 16 17:56:49.477044 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 16 17:56:49.477053 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 16 17:56:49.477061 systemd[1]: Reached target initrd-usr-fs.target - Initrd /usr File System. Jan 16 17:56:49.477070 systemd[1]: Reached target paths.target - Path Units. Jan 16 17:56:49.477079 systemd[1]: Reached target slices.target - Slice Units. Jan 16 17:56:49.477088 systemd[1]: Reached target swap.target - Swaps. Jan 16 17:56:49.477097 systemd[1]: Reached target timers.target - Timer Units. Jan 16 17:56:49.477105 systemd[1]: Listening on iscsid.socket - Open-iSCSI iscsid Socket. Jan 16 17:56:49.477114 systemd[1]: Listening on iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 16 17:56:49.477123 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 16 17:56:49.477132 systemd[1]: Listening on systemd-journald-dev-log.socket - Journal Socket (/dev/log). Jan 16 17:56:49.477140 systemd[1]: Listening on systemd-journald.socket - Journal Sockets. Jan 16 17:56:49.477151 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 16 17:56:49.477159 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 16 17:56:49.477168 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 16 17:56:49.477176 systemd[1]: Reached target sockets.target - Socket Units. Jan 16 17:56:49.477186 systemd[1]: afterburn-network-kargs.service - Afterburn Initrd Setup Network Kernel Arguments was skipped because no trigger condition checks were met. Jan 16 17:56:49.477195 systemd[1]: Starting ignition-setup-pre.service - Ignition env setup... Jan 16 17:56:49.477203 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 16 17:56:49.477213 systemd[1]: Finished network-cleanup.service - Network Cleanup. Jan 16 17:56:49.477222 systemd[1]: systemd-battery-check.service - Check battery level during early boot was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/class/power_supply). Jan 16 17:56:49.477231 systemd[1]: Starting systemd-fsck-usr.service... 
Jan 16 17:56:49.477239 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 16 17:56:49.477248 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 16 17:56:49.477258 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 17:56:49.477267 systemd[1]: Finished ignition-setup-pre.service - Ignition env setup. Jan 16 17:56:49.477276 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 16 17:56:49.477284 systemd[1]: Finished systemd-fsck-usr.service. Jan 16 17:56:49.477293 systemd[1]: Starting systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully... Jan 16 17:56:49.477333 systemd-journald[349]: Collecting audit messages is enabled. Jan 16 17:56:49.477354 kernel: bridge: filtering via arp/ip/ip6tables is no longer available by default. Update your scripts to load br_netfilter if you need this. Jan 16 17:56:49.477362 kernel: Bridge firewalling registered Jan 16 17:56:49.477373 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 16 17:56:49.477382 kernel: audit: type=1130 audit(1768586209.435:2): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.477390 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 16 17:56:49.477399 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 17:56:49.477408 kernel: audit: type=1130 audit(1768586209.451:3): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.477417 systemd[1]: Starting dracut-cmdline-ask.service - dracut ask for additional cmdline parameters... Jan 16 17:56:49.477427 systemd[1]: Finished systemd-tmpfiles-setup-dev-early.service - Create Static Device Nodes in /dev gracefully. Jan 16 17:56:49.477437 kernel: audit: type=1130 audit(1768586209.463:4): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.477445 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... Jan 16 17:56:49.477454 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 16 17:56:49.477463 kernel: audit: type=1130 audit(1768586209.473:5): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.477472 systemd-journald[349]: Journal started Jan 16 17:56:49.477492 systemd-journald[349]: Runtime Journal (/run/log/journal/2975e9ee83bd40498bdd8e8c9bfab007) is 8M, max 76.5M, 68.5M free. Jan 16 17:56:49.435000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.451000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:49.463000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev-early comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.473000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.430210 systemd-modules-load[351]: Inserted module 'br_netfilter' Jan 16 17:56:49.477000 audit: BPF prog-id=6 op=LOAD Jan 16 17:56:49.481812 kernel: audit: type=1334 audit(1768586209.477:6): prog-id=6 op=LOAD Jan 16 17:56:49.481847 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 16 17:56:49.485562 systemd[1]: Started systemd-journald.service - Journal Service. Jan 16 17:56:49.484000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.488741 kernel: audit: type=1130 audit(1768586209.484:7): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.489000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.489656 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 16 17:56:49.493720 kernel: audit: type=1130 audit(1768586209.489:8): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.498206 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 16 17:56:49.500376 systemd[1]: Finished dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 16 17:56:49.501000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.503300 systemd[1]: Starting dracut-cmdline.service - dracut cmdline hook... Jan 16 17:56:49.505719 kernel: audit: type=1130 audit(1768586209.501:9): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.520568 dracut-cmdline[387]: dracut-109 Jan 16 17:56:49.523573 dracut-cmdline[387]: Using kernel command line parameters: SYSTEMD_SULOGIN_FORCE=1 BOOT_IMAGE=/flatcar/vmlinuz-a mount.usr=/dev/mapper/usr verity.usr=PARTUUID=7130c94a-213a-4e5a-8e26-6cce9662f132 rootflags=rw mount.usrflags=ro consoleblank=0 root=LABEL=ROOT console=ttyAMA0,115200n8 flatcar.first_boot=detected acpi=force flatcar.oem.id=hetzner verity.usrhash=924fb3eb04ba1d8edcb66284d30e3342855b0579b62556e7722bcf37e82bda13 Jan 16 17:56:49.522515 systemd-tmpfiles[385]: /usr/lib/tmpfiles.d/var.conf:14: Duplicate line for path "/var/log", ignoring. Jan 16 17:56:49.532639 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. 
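
The bridge-firewalling notice above is informational on this host, since systemd-modules-load already inserted br_netfilter; on systems where nothing loads it automatically, the standard modules-load.d and sysctl.d mechanisms cover it. A minimal sketch (file names are conventional choices, not taken from this log):

    # Load br_netfilter on every boot
    echo br_netfilter > /etc/modules-load.d/br_netfilter.conf
    # Re-enable iptables filtering of bridged traffic, if existing scripts rely on it
    cat <<'EOF' > /etc/sysctl.d/99-bridge-nf.conf
    net.bridge.bridge-nf-call-iptables = 1
    net.bridge.bridge-nf-call-ip6tables = 1
    EOF
    sysctl --system
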
Jan 16 17:56:49.533000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.539482 kernel: audit: type=1130 audit(1768586209.533:10): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.553545 systemd-resolved[368]: Positive Trust Anchors: Jan 16 17:56:49.554270 systemd-resolved[368]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 16 17:56:49.555309 systemd-resolved[368]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 16 17:56:49.555343 systemd-resolved[368]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 16 17:56:49.586430 systemd-resolved[368]: Defaulting to hostname 'linux'. Jan 16 17:56:49.588000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.588041 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 16 17:56:49.588713 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 16 17:56:49.630576 kernel: Loading iSCSI transport class v2.0-870. Jan 16 17:56:49.640639 kernel: iscsi: registered transport (tcp) Jan 16 17:56:49.656618 kernel: iscsi: registered transport (qla4xxx) Jan 16 17:56:49.656701 kernel: QLogic iSCSI HBA Driver Jan 16 17:56:49.683216 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 16 17:56:49.704505 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 16 17:56:49.708000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.710185 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 16 17:56:49.758081 systemd[1]: Finished dracut-cmdline.service - dracut cmdline hook. Jan 16 17:56:49.758000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.760158 systemd[1]: Starting dracut-pre-udev.service - dracut pre-udev hook... Jan 16 17:56:49.761432 systemd[1]: Starting parse-ip-for-networkd.service - Write systemd-networkd units from cmdline... Jan 16 17:56:49.798790 systemd[1]: Finished dracut-pre-udev.service - dracut pre-udev hook. Jan 16 17:56:49.798000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:49.799000 audit: BPF prog-id=7 op=LOAD Jan 16 17:56:49.799000 audit: BPF prog-id=8 op=LOAD Jan 16 17:56:49.801670 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 16 17:56:49.834245 systemd-udevd[620]: Using default interface naming scheme 'v257'. Jan 16 17:56:49.843086 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 16 17:56:49.844000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.846329 systemd[1]: Starting dracut-pre-trigger.service - dracut pre-trigger hook... Jan 16 17:56:49.880155 dracut-pre-trigger[677]: rd.md=0: removing MD RAID activation Jan 16 17:56:49.894619 systemd[1]: Finished parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 16 17:56:49.894000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.896000 audit: BPF prog-id=9 op=LOAD Jan 16 17:56:49.897528 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 16 17:56:49.916821 systemd[1]: Finished dracut-pre-trigger.service - dracut pre-trigger hook. Jan 16 17:56:49.916000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.918393 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 16 17:56:49.942128 systemd-networkd[740]: lo: Link UP Jan 16 17:56:49.942675 systemd-networkd[740]: lo: Gained carrier Jan 16 17:56:49.943000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.943296 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 16 17:56:49.944050 systemd[1]: Reached target network.target - Network. Jan 16 17:56:49.987762 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 16 17:56:49.987000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:49.991749 systemd[1]: Starting dracut-initqueue.service - dracut initqueue hook... Jan 16 17:56:50.122087 systemd[1]: Found device dev-disk-by\x2dlabel-ROOT.device - QEMU_HARDDISK ROOT. Jan 16 17:56:50.132844 systemd[1]: Found device dev-disk-by\x2dlabel-EFI\x2dSYSTEM.device - QEMU_HARDDISK EFI-SYSTEM. Jan 16 17:56:50.144066 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 16 17:56:50.154817 systemd[1]: Found device dev-disk-by\x2dpartlabel-USR\x2dA.device - QEMU_HARDDISK USR-A. Jan 16 17:56:50.156305 systemd[1]: Starting disk-uuid.service - Generate new UUID for disk GPT if necessary... Jan 16 17:56:50.177443 disk-uuid[798]: Primary Header is updated. Jan 16 17:56:50.177443 disk-uuid[798]: Secondary Entries is updated. Jan 16 17:56:50.177443 disk-uuid[798]: Secondary Header is updated. 
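
The four devices the initrd was expecting (ROOT, EFI-SYSTEM, OEM, USR-A) appear above once udev has triggered; the same labels and partition labels can be inspected by hand, for example:

    # Show which block devices carry the labels/partlabels the initrd waits for
    lsblk -o NAME,SIZE,TYPE,FSTYPE,LABEL,PARTLABEL
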
Jan 16 17:56:50.181092 kernel: input: QEMU QEMU USB Tablet as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-1/1-1:1.0/0003:0627:0001.0001/input/input1 Jan 16 17:56:50.186597 kernel: hid-generic 0003:0627:0001.0001: input,hidraw0: USB HID v0.01 Mouse [QEMU QEMU USB Tablet] on usb-0000:02:00.0-1/input0 Jan 16 17:56:50.207567 kernel: input: QEMU QEMU USB Keyboard as /devices/pci0000:00/0000:00:02.1/0000:02:00.0/usb1/1-2/1-2:1.0/0003:0627:0001.0002/input/input2 Jan 16 17:56:50.215855 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 16 17:56:50.215996 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 17:56:50.217000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:50.217165 systemd[1]: Stopping systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 17:56:50.218317 systemd-networkd[740]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:50.218321 systemd-networkd[740]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 16 17:56:50.222185 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 17:56:50.223539 systemd-networkd[740]: eth1: Link UP Jan 16 17:56:50.223764 systemd-networkd[740]: eth1: Gained carrier Jan 16 17:56:50.223777 systemd-networkd[740]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:50.234095 systemd-networkd[740]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:50.234111 systemd-networkd[740]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 16 17:56:50.237173 systemd-networkd[740]: eth0: Link UP Jan 16 17:56:50.237402 systemd-networkd[740]: eth0: Gained carrier Jan 16 17:56:50.237416 systemd-networkd[740]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:50.260632 kernel: hid-generic 0003:0627:0001.0002: input,hidraw1: USB HID v1.11 Keyboard [QEMU QEMU USB Keyboard] on usb-0000:02:00.0-2/input0 Jan 16 17:56:50.260858 kernel: usbcore: registered new interface driver usbhid Jan 16 17:56:50.260870 kernel: usbhid: USB HID core driver Jan 16 17:56:50.269408 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 17:56:50.270000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:50.271658 systemd-networkd[740]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 16 17:56:50.290643 systemd-networkd[740]: eth0: DHCPv4 address 49.12.189.56/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 16 17:56:50.308011 systemd[1]: Finished dracut-initqueue.service - dracut initqueue hook. Jan 16 17:56:50.308000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:50.309145 systemd[1]: Reached target remote-fs-pre.target - Preparation for Remote File Systems. 
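
systemd-networkd notes for each NIC that zz-default.network matched "based on potentially unpredictable interface name". A drop-in .network unit that matches on the MAC address avoids that ambiguity; a hedged sketch, with the file name and MAC as placeholders rather than values from this log:

    cat <<'EOF' > /etc/systemd/network/10-eth0.network
    [Match]
    MACAddress=96:00:00:aa:bb:cc

    [Network]
    DHCP=yes
    EOF
    networkctl reload
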
Jan 16 17:56:50.312290 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 16 17:56:50.313077 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 16 17:56:50.315232 systemd[1]: Starting dracut-pre-mount.service - dracut pre-mount hook... Jan 16 17:56:50.340219 systemd[1]: Finished dracut-pre-mount.service - dracut pre-mount hook. Jan 16 17:56:50.341000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.214004 disk-uuid[799]: Warning: The kernel is still using the old partition table. Jan 16 17:56:51.214004 disk-uuid[799]: The new table will be used at the next reboot or after you Jan 16 17:56:51.214004 disk-uuid[799]: run partprobe(8) or kpartx(8) Jan 16 17:56:51.214004 disk-uuid[799]: The operation has completed successfully. Jan 16 17:56:51.224313 systemd[1]: disk-uuid.service: Deactivated successfully. Jan 16 17:56:51.225000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.225000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=disk-uuid comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.224456 systemd[1]: Finished disk-uuid.service - Generate new UUID for disk GPT if necessary. Jan 16 17:56:51.228789 systemd[1]: Starting ignition-setup.service - Ignition (setup)... Jan 16 17:56:51.265586 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (832) Jan 16 17:56:51.267129 kernel: BTRFS info (device sda6): first mount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc Jan 16 17:56:51.267191 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 16 17:56:51.270693 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 16 17:56:51.270749 kernel: BTRFS info (device sda6): turning on async discard Jan 16 17:56:51.270776 kernel: BTRFS info (device sda6): enabling free space tree Jan 16 17:56:51.278617 kernel: BTRFS info (device sda6): last unmount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc Jan 16 17:56:51.279649 systemd[1]: Finished ignition-setup.service - Ignition (setup). Jan 16 17:56:51.279000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.281507 systemd[1]: Starting ignition-fetch-offline.service - Ignition (fetch-offline)... Jan 16 17:56:51.290954 systemd-networkd[740]: eth0: Gained IPv6LL Jan 16 17:56:51.409745 ignition[851]: Ignition 2.24.0 Jan 16 17:56:51.410313 ignition[851]: Stage: fetch-offline Jan 16 17:56:51.410360 ignition[851]: no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:51.410372 ignition[851]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:51.410513 ignition[851]: parsed url from cmdline: "" Jan 16 17:56:51.412356 systemd[1]: Finished ignition-fetch-offline.service - Ignition (fetch-offline). Jan 16 17:56:51.413000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:51.410517 ignition[851]: no config URL provided Jan 16 17:56:51.414580 systemd[1]: Starting ignition-fetch.service - Ignition (fetch)... Jan 16 17:56:51.410521 ignition[851]: reading system config file "/usr/lib/ignition/user.ign" Jan 16 17:56:51.410530 ignition[851]: no config at "/usr/lib/ignition/user.ign" Jan 16 17:56:51.410535 ignition[851]: failed to fetch config: resource requires networking Jan 16 17:56:51.411243 ignition[851]: Ignition finished successfully Jan 16 17:56:51.447720 ignition[859]: Ignition 2.24.0 Jan 16 17:56:51.447738 ignition[859]: Stage: fetch Jan 16 17:56:51.447888 ignition[859]: no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:51.447896 ignition[859]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:51.448027 ignition[859]: parsed url from cmdline: "" Jan 16 17:56:51.448031 ignition[859]: no config URL provided Jan 16 17:56:51.448039 ignition[859]: reading system config file "/usr/lib/ignition/user.ign" Jan 16 17:56:51.448045 ignition[859]: no config at "/usr/lib/ignition/user.ign" Jan 16 17:56:51.448074 ignition[859]: GET http://169.254.169.254/hetzner/v1/userdata: attempt #1 Jan 16 17:56:51.457335 ignition[859]: GET result: OK Jan 16 17:56:51.457476 ignition[859]: parsing config with SHA512: 90819e62995bcc18f95492f9c8685fd280e752bfba9c52512896cff9a2ff87a1d8c41b153b13a56f90f5716f7bc0817f68478f0ec7c946330562dceed47dfab0 Jan 16 17:56:51.464968 unknown[859]: fetched base config from "system" Jan 16 17:56:51.464981 unknown[859]: fetched base config from "system" Jan 16 17:56:51.465346 ignition[859]: fetch: fetch complete Jan 16 17:56:51.464987 unknown[859]: fetched user config from "hetzner" Jan 16 17:56:51.465351 ignition[859]: fetch: fetch passed Jan 16 17:56:51.467245 systemd[1]: Finished ignition-fetch.service - Ignition (fetch). Jan 16 17:56:51.465398 ignition[859]: Ignition finished successfully Jan 16 17:56:51.468000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.470509 systemd[1]: Starting ignition-kargs.service - Ignition (kargs)... Jan 16 17:56:51.499042 ignition[866]: Ignition 2.24.0 Jan 16 17:56:51.499058 ignition[866]: Stage: kargs Jan 16 17:56:51.499216 ignition[866]: no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:51.499227 ignition[866]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:51.500047 ignition[866]: kargs: kargs passed Jan 16 17:56:51.500096 ignition[866]: Ignition finished successfully Jan 16 17:56:51.502390 systemd[1]: Finished ignition-kargs.service - Ignition (kargs). Jan 16 17:56:51.503000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.505205 systemd[1]: Starting ignition-disks.service - Ignition (disks)... Jan 16 17:56:51.533210 ignition[873]: Ignition 2.24.0 Jan 16 17:56:51.533229 ignition[873]: Stage: disks Jan 16 17:56:51.533363 ignition[873]: no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:51.533372 ignition[873]: no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:51.538000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:51.537844 systemd[1]: Finished ignition-disks.service - Ignition (disks). Jan 16 17:56:51.534161 ignition[873]: disks: disks passed Jan 16 17:56:51.540448 systemd[1]: Reached target initrd-root-device.target - Initrd Root Device. Jan 16 17:56:51.534210 ignition[873]: Ignition finished successfully Jan 16 17:56:51.541678 systemd[1]: Reached target local-fs-pre.target - Preparation for Local File Systems. Jan 16 17:56:51.543223 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 16 17:56:51.544321 systemd[1]: Reached target sysinit.target - System Initialization. Jan 16 17:56:51.545538 systemd[1]: Reached target basic.target - Basic System. Jan 16 17:56:51.547996 systemd[1]: Starting systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT... Jan 16 17:56:51.548801 systemd-networkd[740]: eth1: Gained IPv6LL Jan 16 17:56:51.594971 systemd-fsck[881]: ROOT: clean, 15/1631200 files, 112378/1617920 blocks Jan 16 17:56:51.600824 systemd[1]: Finished systemd-fsck-root.service - File System Check on /dev/disk/by-label/ROOT. Jan 16 17:56:51.603000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.605642 systemd[1]: Mounting sysroot.mount - /sysroot... Jan 16 17:56:51.689630 kernel: EXT4-fs (sda9): mounted filesystem 3360ad79-d1e3-4f32-ae7d-4a8c0a3c719d r/w with ordered data mode. Quota mode: none. Jan 16 17:56:51.690721 systemd[1]: Mounted sysroot.mount - /sysroot. Jan 16 17:56:51.691774 systemd[1]: Reached target initrd-root-fs.target - Initrd Root File System. Jan 16 17:56:51.694514 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 16 17:56:51.696682 systemd[1]: Mounting sysroot-usr.mount - /sysroot/usr... Jan 16 17:56:51.715132 systemd[1]: Starting flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent... Jan 16 17:56:51.719794 systemd[1]: ignition-remount-sysroot.service - Remount /sysroot read-write for Ignition was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/sysroot). Jan 16 17:56:51.721963 systemd[1]: Reached target ignition-diskful.target - Ignition Boot Disk Setup. Jan 16 17:56:51.725656 systemd[1]: Mounted sysroot-usr.mount - /sysroot/usr. Jan 16 17:56:51.729677 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (889) Jan 16 17:56:51.729624 systemd[1]: Starting initrd-setup-root.service - Root filesystem setup... Jan 16 17:56:51.732640 kernel: BTRFS info (device sda6): first mount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc Jan 16 17:56:51.732684 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 16 17:56:51.737350 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 16 17:56:51.737407 kernel: BTRFS info (device sda6): turning on async discard Jan 16 17:56:51.737979 kernel: BTRFS info (device sda6): enabling free space tree Jan 16 17:56:51.741795 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
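
disk-uuid warned earlier that the kernel is still using the old partition table after the GPT headers were rewritten; as that message says, a reboot or a partition-table re-read clears it. A minimal sketch, assuming /dev/sda is the disk in question:

    # Ask the kernel to re-read the partition table without rebooting
    partprobe /dev/sda
    # or update device-mapper partition mappings instead
    kpartx -u /dev/sda
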
Jan 16 17:56:51.798201 coreos-metadata[891]: Jan 16 17:56:51.798 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/hostname: Attempt #1 Jan 16 17:56:51.801567 coreos-metadata[891]: Jan 16 17:56:51.800 INFO Fetch successful Jan 16 17:56:51.801567 coreos-metadata[891]: Jan 16 17:56:51.800 INFO wrote hostname ci-4580-0-0-p-03fd9ab712 to /sysroot/etc/hostname Jan 16 17:56:51.805155 systemd[1]: Finished flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 16 17:56:51.805000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.914665 systemd[1]: Finished initrd-setup-root.service - Root filesystem setup. Jan 16 17:56:51.915000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.919167 systemd[1]: Starting ignition-mount.service - Ignition (mount)... Jan 16 17:56:51.920969 systemd[1]: Starting sysroot-boot.service - /sysroot/boot... Jan 16 17:56:51.949475 systemd[1]: sysroot-oem.mount: Deactivated successfully. Jan 16 17:56:51.951624 kernel: BTRFS info (device sda6): last unmount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc Jan 16 17:56:51.969203 systemd[1]: Finished sysroot-boot.service - /sysroot/boot. Jan 16 17:56:51.969000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.981575 ignition[992]: INFO : Ignition 2.24.0 Jan 16 17:56:51.981575 ignition[992]: INFO : Stage: mount Jan 16 17:56:51.981575 ignition[992]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:51.981575 ignition[992]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:51.986286 ignition[992]: INFO : mount: mount passed Jan 16 17:56:51.986286 ignition[992]: INFO : Ignition finished successfully Jan 16 17:56:51.986000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:51.985247 systemd[1]: Finished ignition-mount.service - Ignition (mount). Jan 16 17:56:51.989329 systemd[1]: Starting ignition-files.service - Ignition (files)... Jan 16 17:56:52.694217 systemd[1]: Mounting sysroot-oem.mount - /sysroot/oem... Jan 16 17:56:52.718641 kernel: BTRFS: device label OEM devid 1 transid 11 /dev/sda6 (8:6) scanned by mount (1002) Jan 16 17:56:52.721646 kernel: BTRFS info (device sda6): first mount of filesystem 5e96ab2e-f088-4ca2-ba97-55451a1893dc Jan 16 17:56:52.721729 kernel: BTRFS info (device sda6): using crc32c (crc32c-generic) checksum algorithm Jan 16 17:56:52.727581 kernel: BTRFS info (device sda6): enabling ssd optimizations Jan 16 17:56:52.727646 kernel: BTRFS info (device sda6): turning on async discard Jan 16 17:56:52.727665 kernel: BTRFS info (device sda6): enabling free space tree Jan 16 17:56:52.730687 systemd[1]: Mounted sysroot-oem.mount - /sysroot/oem. 
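
Both Ignition's fetch stage and the coreos-metadata hostname agent talk to Hetzner's link-local metadata service at 169.254.169.254. The endpoints seen in the log can be queried by hand from a running instance, for example:

    curl -s http://169.254.169.254/hetzner/v1/userdata
    curl -s http://169.254.169.254/hetzner/v1/metadata/hostname
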
Jan 16 17:56:52.769585 ignition[1019]: INFO : Ignition 2.24.0 Jan 16 17:56:52.769585 ignition[1019]: INFO : Stage: files Jan 16 17:56:52.769585 ignition[1019]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:52.769585 ignition[1019]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:52.772253 ignition[1019]: DEBUG : files: compiled without relabeling support, skipping Jan 16 17:56:52.774072 ignition[1019]: INFO : files: ensureUsers: op(1): [started] creating or modifying user "core" Jan 16 17:56:52.774072 ignition[1019]: DEBUG : files: ensureUsers: op(1): executing: "usermod" "--root" "/sysroot" "core" Jan 16 17:56:52.778874 ignition[1019]: INFO : files: ensureUsers: op(1): [finished] creating or modifying user "core" Jan 16 17:56:52.780227 ignition[1019]: INFO : files: ensureUsers: op(2): [started] adding ssh keys to user "core" Jan 16 17:56:52.781852 unknown[1019]: wrote ssh authorized keys file for user: core Jan 16 17:56:52.783117 ignition[1019]: INFO : files: ensureUsers: op(2): [finished] adding ssh keys to user "core" Jan 16 17:56:52.784563 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(3): [started] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 16 17:56:52.784563 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz: attempt #1 Jan 16 17:56:52.887047 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(3): GET result: OK Jan 16 17:56:52.996416 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(3): [finished] writing file "/sysroot/opt/helm-v3.17.3-linux-arm64.tar.gz" Jan 16 17:56:52.996416 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(4): [started] writing file "/sysroot/home/core/install.sh" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(4): [finished] writing file "/sysroot/home/core/install.sh" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(5): [started] writing file "/sysroot/home/core/nginx.yaml" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(5): [finished] writing file "/sysroot/home/core/nginx.yaml" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(6): [started] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(6): [finished] writing file "/sysroot/home/core/nfs-pod.yaml" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(7): [started] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(7): [finished] writing file "/sysroot/home/core/nfs-pvc.yaml" Jan 16 17:56:52.999667 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(8): [started] writing file "/sysroot/etc/flatcar/update.conf" Jan 16 17:56:53.008483 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(8): [finished] writing file "/sysroot/etc/flatcar/update.conf" Jan 16 17:56:53.008483 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(9): [started] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 16 17:56:53.008483 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(9): 
[finished] writing link "/sysroot/etc/extensions/kubernetes.raw" -> "/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 16 17:56:53.008483 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(a): [started] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 16 17:56:53.008483 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET https://extensions.flatcar.org/extensions/kubernetes-v1.34.1-arm64.raw: attempt #1 Jan 16 17:56:53.474002 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(a): GET result: OK Jan 16 17:56:55.242429 ignition[1019]: INFO : files: createFilesystemsFiles: createFiles: op(a): [finished] writing file "/sysroot/opt/extensions/kubernetes/kubernetes-v1.34.1-arm64.raw" Jan 16 17:56:55.242429 ignition[1019]: INFO : files: op(b): [started] processing unit "prepare-helm.service" Jan 16 17:56:55.245653 ignition[1019]: INFO : files: op(b): op(c): [started] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 16 17:56:55.248925 ignition[1019]: INFO : files: op(b): op(c): [finished] writing unit "prepare-helm.service" at "/sysroot/etc/systemd/system/prepare-helm.service" Jan 16 17:56:55.248925 ignition[1019]: INFO : files: op(b): [finished] processing unit "prepare-helm.service" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: op(d): [started] processing unit "coreos-metadata.service" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: op(d): op(e): [started] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: op(d): op(e): [finished] writing systemd drop-in "00-custom-metadata.conf" at "/sysroot/etc/systemd/system/coreos-metadata.service.d/00-custom-metadata.conf" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: op(d): [finished] processing unit "coreos-metadata.service" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: op(f): [started] setting preset to enabled for "prepare-helm.service" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: op(f): [finished] setting preset to enabled for "prepare-helm.service" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: createResultFile: createFiles: op(10): [started] writing file "/sysroot/etc/.ignition-result.json" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: createResultFile: createFiles: op(10): [finished] writing file "/sysroot/etc/.ignition-result.json" Jan 16 17:56:55.250966 ignition[1019]: INFO : files: files passed Jan 16 17:56:55.250966 ignition[1019]: INFO : Ignition finished successfully Jan 16 17:56:55.268322 kernel: kauditd_printk_skb: 28 callbacks suppressed Jan 16 17:56:55.268349 kernel: audit: type=1130 audit(1768586215.256:39): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.256000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.253436 systemd[1]: Finished ignition-files.service - Ignition (files). Jan 16 17:56:55.260313 systemd[1]: Starting ignition-quench.service - Ignition (record completion)... Jan 16 17:56:55.267096 systemd[1]: Starting initrd-setup-root-after-ignition.service - Root filesystem completion... 
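
The files stage above (helm tarball, SSH key for core, prepare-helm.service preset, kubernetes sysext) is driven entirely by the Ignition config delivered as Hetzner user data, which is not reproduced in this log. Purely as an illustration, a minimal Butane-style sketch that would yield similar entries; every value below is a placeholder except the helm path and URL taken from the log, and the unit body is only a guess at what prepare-helm.service might contain:

    # Hypothetical Butane source; transpile to Ignition JSON before setting it as user data
    cat <<'EOF' > config.bu
    variant: flatcar
    version: 1.0.0
    passwd:
      users:
        - name: core
          ssh_authorized_keys:
            - ssh-ed25519 AAAA...placeholder
    storage:
      files:
        - path: /opt/helm-v3.17.3-linux-arm64.tar.gz
          contents:
            source: https://get.helm.sh/helm-v3.17.3-linux-arm64.tar.gz
    systemd:
      units:
        - name: prepare-helm.service
          enabled: true
          contents: |
            [Unit]
            Description=Unpack helm (illustrative only)
            [Service]
            Type=oneshot
            ExecStartPre=/usr/bin/mkdir -p /opt/bin
            ExecStart=/usr/bin/tar -xf /opt/helm-v3.17.3-linux-arm64.tar.gz -C /opt/bin --strip-components=1 linux-arm64/helm
            [Install]
            WantedBy=multi-user.target
    EOF
    butane --pretty --strict config.bu > config.ign
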
Jan 16 17:56:55.276023 systemd[1]: ignition-quench.service: Deactivated successfully. Jan 16 17:56:55.276128 systemd[1]: Finished ignition-quench.service - Ignition (record completion). Jan 16 17:56:55.278000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.278000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.281292 kernel: audit: type=1130 audit(1768586215.278:40): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.281342 kernel: audit: type=1131 audit(1768586215.278:41): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-quench comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.287102 initrd-setup-root-after-ignition[1051]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 16 17:56:55.287102 initrd-setup-root-after-ignition[1051]: grep: /sysroot/usr/share/flatcar/enabled-sysext.conf: No such file or directory Jan 16 17:56:55.290173 initrd-setup-root-after-ignition[1055]: grep: /sysroot/etc/flatcar/enabled-sysext.conf: No such file or directory Jan 16 17:56:55.292750 systemd[1]: Finished initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 16 17:56:55.294000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.294679 systemd[1]: Reached target ignition-complete.target - Ignition Complete. Jan 16 17:56:55.297689 kernel: audit: type=1130 audit(1768586215.294:42): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.300074 systemd[1]: Starting initrd-parse-etc.service - Mountpoints Configured in the Real Root... Jan 16 17:56:55.375014 systemd[1]: initrd-parse-etc.service: Deactivated successfully. Jan 16 17:56:55.375183 systemd[1]: Finished initrd-parse-etc.service - Mountpoints Configured in the Real Root. Jan 16 17:56:55.384048 kernel: audit: type=1130 audit(1768586215.378:43): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.384075 kernel: audit: type=1131 audit(1768586215.378:44): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.378000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.378000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-parse-etc comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.378969 systemd[1]: Reached target initrd-fs.target - Initrd File Systems. 
Jan 16 17:56:55.383216 systemd[1]: Reached target initrd.target - Initrd Default Target. Jan 16 17:56:55.384821 systemd[1]: dracut-mount.service - dracut mount hook was skipped because no trigger condition checks were met. Jan 16 17:56:55.387273 systemd[1]: Starting dracut-pre-pivot.service - dracut pre-pivot and cleanup hook... Jan 16 17:56:55.414065 systemd[1]: Finished dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 16 17:56:55.415000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.419653 kernel: audit: type=1130 audit(1768586215.415:45): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.420203 systemd[1]: Starting initrd-cleanup.service - Cleaning Up and Shutting Down Daemons... Jan 16 17:56:55.448703 systemd[1]: Unnecessary job was removed for dev-mapper-usr.device - /dev/mapper/usr. Jan 16 17:56:55.449145 systemd[1]: Stopped target nss-lookup.target - Host and Network Name Lookups. Jan 16 17:56:55.450295 systemd[1]: Stopped target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 16 17:56:55.451603 systemd[1]: Stopped target timers.target - Timer Units. Jan 16 17:56:55.452736 systemd[1]: dracut-pre-pivot.service: Deactivated successfully. Jan 16 17:56:55.453000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.452859 systemd[1]: Stopped dracut-pre-pivot.service - dracut pre-pivot and cleanup hook. Jan 16 17:56:55.454335 systemd[1]: Stopped target initrd.target - Initrd Default Target. Jan 16 17:56:55.458353 kernel: audit: type=1131 audit(1768586215.453:46): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-pivot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.456513 systemd[1]: Stopped target basic.target - Basic System. Jan 16 17:56:55.457899 systemd[1]: Stopped target ignition-complete.target - Ignition Complete. Jan 16 17:56:55.459205 systemd[1]: Stopped target ignition-diskful.target - Ignition Boot Disk Setup. Jan 16 17:56:55.460341 systemd[1]: Stopped target initrd-root-device.target - Initrd Root Device. Jan 16 17:56:55.461487 systemd[1]: Stopped target initrd-usr-fs.target - Initrd /usr File System. Jan 16 17:56:55.462703 systemd[1]: Stopped target remote-fs.target - Remote File Systems. Jan 16 17:56:55.463820 systemd[1]: Stopped target remote-fs-pre.target - Preparation for Remote File Systems. Jan 16 17:56:55.465082 systemd[1]: Stopped target sysinit.target - System Initialization. Jan 16 17:56:55.466196 systemd[1]: Stopped target local-fs.target - Local File Systems. Jan 16 17:56:55.467324 systemd[1]: Stopped target swap.target - Swaps. Jan 16 17:56:55.468000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.468255 systemd[1]: dracut-pre-mount.service: Deactivated successfully. 
Jan 16 17:56:55.472992 kernel: audit: type=1131 audit(1768586215.468:47): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.468390 systemd[1]: Stopped dracut-pre-mount.service - dracut pre-mount hook. Jan 16 17:56:55.469737 systemd[1]: Stopped target cryptsetup.target - Local Encrypted Volumes. Jan 16 17:56:55.472928 systemd[1]: Stopped target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 16 17:56:55.473774 systemd[1]: clevis-luks-askpass.path: Deactivated successfully. Jan 16 17:56:55.475579 systemd[1]: Stopped clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. Jan 16 17:56:55.476898 systemd[1]: dracut-initqueue.service: Deactivated successfully. Jan 16 17:56:55.478000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.477041 systemd[1]: Stopped dracut-initqueue.service - dracut initqueue hook. Jan 16 17:56:55.480000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root-after-ignition comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.479235 systemd[1]: initrd-setup-root-after-ignition.service: Deactivated successfully. Jan 16 17:56:55.483177 kernel: audit: type=1131 audit(1768586215.478:48): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-initqueue comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.482000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-files comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.479350 systemd[1]: Stopped initrd-setup-root-after-ignition.service - Root filesystem completion. Jan 16 17:56:55.483000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=flatcar-metadata-hostname comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.481423 systemd[1]: ignition-files.service: Deactivated successfully. Jan 16 17:56:55.481536 systemd[1]: Stopped ignition-files.service - Ignition (files). Jan 16 17:56:55.482651 systemd[1]: flatcar-metadata-hostname.service: Deactivated successfully. Jan 16 17:56:55.482777 systemd[1]: Stopped flatcar-metadata-hostname.service - Flatcar Metadata Hostname Agent. Jan 16 17:56:55.488000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.484871 systemd[1]: Stopping ignition-mount.service - Ignition (mount)... Jan 16 17:56:55.486308 systemd[1]: kmod-static-nodes.service: Deactivated successfully. Jan 16 17:56:55.486435 systemd[1]: Stopped kmod-static-nodes.service - Create List of Static Device Nodes. Jan 16 17:56:55.491739 systemd[1]: Stopping sysroot-boot.service - /sysroot/boot... Jan 16 17:56:55.496000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.495232 systemd[1]: systemd-tmpfiles-setup.service: Deactivated successfully. 
Jan 16 17:56:55.497000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.495397 systemd[1]: Stopped systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 16 17:56:55.496724 systemd[1]: systemd-udev-trigger.service: Deactivated successfully. Jan 16 17:56:55.496844 systemd[1]: Stopped systemd-udev-trigger.service - Coldplug All udev Devices. Jan 16 17:56:55.498129 systemd[1]: dracut-pre-trigger.service: Deactivated successfully. Jan 16 17:56:55.498246 systemd[1]: Stopped dracut-pre-trigger.service - dracut pre-trigger hook. Jan 16 17:56:55.502000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.512587 systemd[1]: initrd-cleanup.service: Deactivated successfully. Jan 16 17:56:55.512698 systemd[1]: Finished initrd-cleanup.service - Cleaning Up and Shutting Down Daemons. Jan 16 17:56:55.514000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.514000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.518414 ignition[1075]: INFO : Ignition 2.24.0 Jan 16 17:56:55.518414 ignition[1075]: INFO : Stage: umount Jan 16 17:56:55.521127 ignition[1075]: INFO : no configs at "/usr/lib/ignition/base.d" Jan 16 17:56:55.521127 ignition[1075]: INFO : no config dir at "/usr/lib/ignition/base.platform.d/hetzner" Jan 16 17:56:55.521127 ignition[1075]: INFO : umount: umount passed Jan 16 17:56:55.521127 ignition[1075]: INFO : Ignition finished successfully Jan 16 17:56:55.524000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-mount comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.522412 systemd[1]: sysroot-boot.mount: Deactivated successfully. Jan 16 17:56:55.526000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-disks comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.523052 systemd[1]: ignition-mount.service: Deactivated successfully. Jan 16 17:56:55.523169 systemd[1]: Stopped ignition-mount.service - Ignition (mount). Jan 16 17:56:55.528000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-kargs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.525153 systemd[1]: ignition-disks.service: Deactivated successfully. Jan 16 17:56:55.530000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.525253 systemd[1]: Stopped ignition-disks.service - Ignition (disks). Jan 16 17:56:55.526742 systemd[1]: ignition-kargs.service: Deactivated successfully. 
Jan 16 17:56:55.534000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-fetch-offline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.526791 systemd[1]: Stopped ignition-kargs.service - Ignition (kargs). Jan 16 17:56:55.528691 systemd[1]: ignition-fetch.service: Deactivated successfully. Jan 16 17:56:55.528744 systemd[1]: Stopped ignition-fetch.service - Ignition (fetch). Jan 16 17:56:55.530385 systemd[1]: Stopped target network.target - Network. Jan 16 17:56:55.532071 systemd[1]: ignition-fetch-offline.service: Deactivated successfully. Jan 16 17:56:55.532186 systemd[1]: Stopped ignition-fetch-offline.service - Ignition (fetch-offline). Jan 16 17:56:55.534395 systemd[1]: Stopped target paths.target - Path Units. Jan 16 17:56:55.535263 systemd[1]: systemd-ask-password-console.path: Deactivated successfully. Jan 16 17:56:55.538608 systemd[1]: Stopped systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 16 17:56:55.539489 systemd[1]: Stopped target slices.target - Slice Units. Jan 16 17:56:55.540417 systemd[1]: Stopped target sockets.target - Socket Units. Jan 16 17:56:55.541513 systemd[1]: iscsid.socket: Deactivated successfully. Jan 16 17:56:55.541576 systemd[1]: Closed iscsid.socket - Open-iSCSI iscsid Socket. Jan 16 17:56:55.545000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.543408 systemd[1]: iscsiuio.socket: Deactivated successfully. Jan 16 17:56:55.547000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=ignition-setup-pre comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.543444 systemd[1]: Closed iscsiuio.socket - Open-iSCSI iscsiuio Socket. Jan 16 17:56:55.544325 systemd[1]: systemd-journald-audit.socket: Deactivated successfully. Jan 16 17:56:55.544353 systemd[1]: Closed systemd-journald-audit.socket - Journal Audit Socket. Jan 16 17:56:55.551000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=sysroot-boot comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.545253 systemd[1]: ignition-setup.service: Deactivated successfully. Jan 16 17:56:55.545311 systemd[1]: Stopped ignition-setup.service - Ignition (setup). Jan 16 17:56:55.553000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-setup-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.546161 systemd[1]: ignition-setup-pre.service: Deactivated successfully. Jan 16 17:56:55.546203 systemd[1]: Stopped ignition-setup-pre.service - Ignition env setup. Jan 16 17:56:55.547270 systemd[1]: Stopping systemd-networkd.service - Network Configuration... Jan 16 17:56:55.548199 systemd[1]: Stopping systemd-resolved.service - Network Name Resolution... Jan 16 17:56:55.550837 systemd[1]: sysroot-boot.service: Deactivated successfully. Jan 16 17:56:55.550992 systemd[1]: Stopped sysroot-boot.service - /sysroot/boot. Jan 16 17:56:55.558000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:55.552097 systemd[1]: initrd-setup-root.service: Deactivated successfully. Jan 16 17:56:55.552186 systemd[1]: Stopped initrd-setup-root.service - Root filesystem setup. Jan 16 17:56:55.557261 systemd[1]: systemd-resolved.service: Deactivated successfully. Jan 16 17:56:55.560000 audit: BPF prog-id=6 op=UNLOAD Jan 16 17:56:55.557382 systemd[1]: Stopped systemd-resolved.service - Network Name Resolution. Jan 16 17:56:55.562978 systemd[1]: systemd-networkd.service: Deactivated successfully. Jan 16 17:56:55.563000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.563708 systemd[1]: Stopped systemd-networkd.service - Network Configuration. Jan 16 17:56:55.566000 audit: BPF prog-id=9 op=UNLOAD Jan 16 17:56:55.567237 systemd[1]: Stopped target network-pre.target - Preparation for Network. Jan 16 17:56:55.568936 systemd[1]: systemd-networkd.socket: Deactivated successfully. Jan 16 17:56:55.569000 systemd[1]: Closed systemd-networkd.socket - Network Service Netlink Socket. Jan 16 17:56:55.571870 systemd[1]: Stopping network-cleanup.service - Network Cleanup... Jan 16 17:56:55.573000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=parse-ip-for-networkd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.572710 systemd[1]: parse-ip-for-networkd.service: Deactivated successfully. Jan 16 17:56:55.575000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.572799 systemd[1]: Stopped parse-ip-for-networkd.service - Write systemd-networkd units from cmdline. Jan 16 17:56:55.575000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.573860 systemd[1]: systemd-sysctl.service: Deactivated successfully. Jan 16 17:56:55.573980 systemd[1]: Stopped systemd-sysctl.service - Apply Kernel Variables. Jan 16 17:56:55.575151 systemd[1]: systemd-modules-load.service: Deactivated successfully. Jan 16 17:56:55.575206 systemd[1]: Stopped systemd-modules-load.service - Load Kernel Modules. Jan 16 17:56:55.576374 systemd[1]: Stopping systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 16 17:56:55.590999 systemd[1]: systemd-udevd.service: Deactivated successfully. Jan 16 17:56:55.592619 systemd[1]: Stopped systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 16 17:56:55.596000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-udevd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.596966 systemd[1]: systemd-udevd-control.socket: Deactivated successfully. Jan 16 17:56:55.597020 systemd[1]: Closed systemd-udevd-control.socket - udev Control Socket. Jan 16 17:56:55.600680 systemd[1]: systemd-udevd-kernel.socket: Deactivated successfully. Jan 16 17:56:55.601486 systemd[1]: Closed systemd-udevd-kernel.socket - udev Kernel Socket. Jan 16 17:56:55.603293 systemd[1]: dracut-pre-udev.service: Deactivated successfully. 
Jan 16 17:56:55.603000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-pre-udev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.603363 systemd[1]: Stopped dracut-pre-udev.service - dracut pre-udev hook. Jan 16 17:56:55.604000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.604841 systemd[1]: dracut-cmdline.service: Deactivated successfully. Jan 16 17:56:55.606000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=dracut-cmdline-ask comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.604902 systemd[1]: Stopped dracut-cmdline.service - dracut cmdline hook. Jan 16 17:56:55.606243 systemd[1]: dracut-cmdline-ask.service: Deactivated successfully. Jan 16 17:56:55.606289 systemd[1]: Stopped dracut-cmdline-ask.service - dracut ask for additional cmdline parameters. Jan 16 17:56:55.608349 systemd[1]: Starting initrd-udevadm-cleanup-db.service - Cleanup udev Database... Jan 16 17:56:55.611000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.613000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.610157 systemd[1]: systemd-network-generator.service: Deactivated successfully. Jan 16 17:56:55.610224 systemd[1]: Stopped systemd-network-generator.service - Generate network units from Kernel command line. Jan 16 17:56:55.612329 systemd[1]: systemd-tmpfiles-setup-dev.service: Deactivated successfully. Jan 16 17:56:55.617000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=systemd-vconsole-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.612383 systemd[1]: Stopped systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 16 17:56:55.616290 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 16 17:56:55.616351 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 17:56:55.633733 systemd[1]: initrd-udevadm-cleanup-db.service: Deactivated successfully. Jan 16 17:56:55.637029 systemd[1]: Finished initrd-udevadm-cleanup-db.service - Cleanup udev Database. Jan 16 17:56:55.637000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.637000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=initrd-udevadm-cleanup-db comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:55.641235 systemd[1]: network-cleanup.service: Deactivated successfully. Jan 16 17:56:55.641355 systemd[1]: Stopped network-cleanup.service - Network Cleanup. Jan 16 17:56:55.642000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=kernel msg='unit=network-cleanup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:55.643178 systemd[1]: Reached target initrd-switch-root.target - Switch Root. Jan 16 17:56:55.645145 systemd[1]: Starting initrd-switch-root.service - Switch Root... Jan 16 17:56:55.669062 systemd[1]: Switching root. Jan 16 17:56:55.705237 systemd-journald[349]: Journal stopped Jan 16 17:56:56.693375 systemd-journald[349]: Received SIGTERM from PID 1 (systemd). Jan 16 17:56:56.693431 kernel: SELinux: policy capability network_peer_controls=1 Jan 16 17:56:56.693447 kernel: SELinux: policy capability open_perms=1 Jan 16 17:56:56.693460 kernel: SELinux: policy capability extended_socket_class=1 Jan 16 17:56:56.693470 kernel: SELinux: policy capability always_check_network=0 Jan 16 17:56:56.693486 kernel: SELinux: policy capability cgroup_seclabel=1 Jan 16 17:56:56.693499 kernel: SELinux: policy capability nnp_nosuid_transition=1 Jan 16 17:56:56.693512 kernel: SELinux: policy capability genfs_seclabel_symlinks=0 Jan 16 17:56:56.693521 kernel: SELinux: policy capability ioctl_skip_cloexec=0 Jan 16 17:56:56.693535 kernel: SELinux: policy capability userspace_initial_context=0 Jan 16 17:56:56.693584 systemd[1]: Successfully loaded SELinux policy in 61.975ms. Jan 16 17:56:56.693604 systemd[1]: Relabeled /dev/, /dev/shm/, /run/ in 6.258ms. Jan 16 17:56:56.693617 systemd[1]: systemd 257.9 running in system mode (+PAM +AUDIT +SELINUX -APPARMOR +IMA +IPE +SMACK +SECCOMP -GCRYPT -GNUTLS +OPENSSL -ACL +BLKID +CURL +ELFUTILS -FIDO2 +IDN2 -IDN +IPTC +KMOD +LIBCRYPTSETUP +LIBCRYPTSETUP_PLUGINS +LIBFDISK +PCRE2 -PWQUALITY -P11KIT -QRENCODE +TPM2 +BZIP2 +LZ4 +XZ +ZLIB +ZSTD -BPF_FRAMEWORK -BTF -XKBCOMMON +UTMP -SYSVINIT +LIBARCHIVE) Jan 16 17:56:56.693629 systemd[1]: Detected virtualization kvm. Jan 16 17:56:56.693640 systemd[1]: Detected architecture arm64. Jan 16 17:56:56.693652 systemd[1]: Detected first boot. Jan 16 17:56:56.693663 systemd[1]: Hostname set to . Jan 16 17:56:56.693674 systemd[1]: Initializing machine ID from SMBIOS/DMI UUID. Jan 16 17:56:56.693685 zram_generator::config[1118]: No configuration found. Jan 16 17:56:56.693700 kernel: NET: Registered PF_VSOCK protocol family Jan 16 17:56:56.693710 systemd[1]: Populated /etc with preset unit settings. Jan 16 17:56:56.693720 systemd[1]: initrd-switch-root.service: Deactivated successfully. Jan 16 17:56:56.693732 systemd[1]: Stopped initrd-switch-root.service - Switch Root. Jan 16 17:56:56.693743 systemd[1]: systemd-journald.service: Scheduled restart job, restart counter is at 1. Jan 16 17:56:56.693757 systemd[1]: Created slice system-addon\x2dconfig.slice - Slice /system/addon-config. Jan 16 17:56:56.693769 systemd[1]: Created slice system-addon\x2drun.slice - Slice /system/addon-run. Jan 16 17:56:56.693780 systemd[1]: Created slice system-getty.slice - Slice /system/getty. Jan 16 17:56:56.693791 systemd[1]: Created slice system-modprobe.slice - Slice /system/modprobe. Jan 16 17:56:56.693802 systemd[1]: Created slice system-serial\x2dgetty.slice - Slice /system/serial-getty. Jan 16 17:56:56.693817 systemd[1]: Created slice system-system\x2dcloudinit.slice - Slice /system/system-cloudinit. Jan 16 17:56:56.693828 systemd[1]: Created slice system-systemd\x2dfsck.slice - Slice /system/systemd-fsck. Jan 16 17:56:56.693840 systemd[1]: Created slice user.slice - User and Session Slice. Jan 16 17:56:56.693851 systemd[1]: Started clevis-luks-askpass.path - Forward Password Requests to Clevis Directory Watch. 
Jan 16 17:56:56.693862 systemd[1]: Started systemd-ask-password-console.path - Dispatch Password Requests to Console Directory Watch. Jan 16 17:56:56.693872 systemd[1]: Started systemd-ask-password-wall.path - Forward Password Requests to Wall Directory Watch. Jan 16 17:56:56.693917 systemd[1]: Set up automount boot.automount - Boot partition Automount Point. Jan 16 17:56:56.693934 systemd[1]: Set up automount proc-sys-fs-binfmt_misc.automount - Arbitrary Executable File Formats File System Automount Point. Jan 16 17:56:56.693946 systemd[1]: Expecting device dev-disk-by\x2dlabel-OEM.device - /dev/disk/by-label/OEM... Jan 16 17:56:56.693957 systemd[1]: Expecting device dev-ttyAMA0.device - /dev/ttyAMA0... Jan 16 17:56:56.693968 systemd[1]: Reached target cryptsetup-pre.target - Local Encrypted Volumes (Pre). Jan 16 17:56:56.693979 systemd[1]: Reached target cryptsetup.target - Local Encrypted Volumes. Jan 16 17:56:56.693992 systemd[1]: Stopped target initrd-switch-root.target - Switch Root. Jan 16 17:56:56.694003 systemd[1]: Stopped target initrd-fs.target - Initrd File Systems. Jan 16 17:56:56.694014 systemd[1]: Stopped target initrd-root-fs.target - Initrd Root File System. Jan 16 17:56:56.694025 systemd[1]: Reached target integritysetup.target - Local Integrity Protected Volumes. Jan 16 17:56:56.694036 systemd[1]: Reached target remote-cryptsetup.target - Remote Encrypted Volumes. Jan 16 17:56:56.694047 systemd[1]: Reached target remote-fs.target - Remote File Systems. Jan 16 17:56:56.694059 systemd[1]: Reached target remote-veritysetup.target - Remote Verity Protected Volumes. Jan 16 17:56:56.694071 systemd[1]: Reached target slices.target - Slice Units. Jan 16 17:56:56.694085 systemd[1]: Reached target swap.target - Swaps. Jan 16 17:56:56.694096 systemd[1]: Reached target veritysetup.target - Local Verity Protected Volumes. Jan 16 17:56:56.694107 systemd[1]: Listening on systemd-coredump.socket - Process Core Dump Socket. Jan 16 17:56:56.694118 systemd[1]: Listening on systemd-creds.socket - Credential Encryption/Decryption. Jan 16 17:56:56.694129 systemd[1]: Listening on systemd-journald-audit.socket - Journal Audit Socket. Jan 16 17:56:56.694140 systemd[1]: Listening on systemd-mountfsd.socket - DDI File System Mounter Socket. Jan 16 17:56:56.694153 systemd[1]: Listening on systemd-networkd.socket - Network Service Netlink Socket. Jan 16 17:56:56.694165 systemd[1]: Listening on systemd-nsresourced.socket - Namespace Resource Manager Socket. Jan 16 17:56:56.694177 systemd[1]: Listening on systemd-oomd.socket - Userspace Out-Of-Memory (OOM) Killer Socket. Jan 16 17:56:56.694188 systemd[1]: Listening on systemd-udevd-control.socket - udev Control Socket. Jan 16 17:56:56.694200 systemd[1]: Listening on systemd-udevd-kernel.socket - udev Kernel Socket. Jan 16 17:56:56.694211 systemd[1]: Listening on systemd-userdbd.socket - User Database Manager Socket. Jan 16 17:56:56.694221 systemd[1]: Mounting dev-hugepages.mount - Huge Pages File System... Jan 16 17:56:56.694234 systemd[1]: Mounting dev-mqueue.mount - POSIX Message Queue File System... Jan 16 17:56:56.694246 systemd[1]: Mounting media.mount - External Media Directory... Jan 16 17:56:56.694257 systemd[1]: Mounting sys-kernel-debug.mount - Kernel Debug File System... Jan 16 17:56:56.694267 systemd[1]: Mounting sys-kernel-tracing.mount - Kernel Trace File System... Jan 16 17:56:56.694277 systemd[1]: Mounting tmp.mount - Temporary Directory /tmp... 
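The SELinux "policy capability" lines a little earlier are printed by the kernel while the policy is loaded. The same flags can be read back from selinuxfs on the running host; this is a minimal sketch, assuming /sys/fs/selinux is mounted (as it is once the policy load above succeeds):

  # Each policy capability is a small 0/1 file under selinuxfs
  grep -r . /sys/fs/selinux/policy_capabilities/
  # Current mode: 0 = permissive, 1 = enforcing
  cat /sys/fs/selinux/enforce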
Jan 16 17:56:56.694288 systemd[1]: var-lib-machines.mount - Virtual Machine and Container Storage (Compatibility) was skipped because of an unmet condition check (ConditionPathExists=/var/lib/machines.raw). Jan 16 17:56:56.694299 systemd[1]: Reached target machines.target - Containers. Jan 16 17:56:56.694311 systemd[1]: Starting flatcar-tmpfiles.service - Create missing system files... Jan 16 17:56:56.694322 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 17:56:56.694333 systemd[1]: Starting kmod-static-nodes.service - Create List of Static Device Nodes... Jan 16 17:56:56.694344 systemd[1]: Starting modprobe@configfs.service - Load Kernel Module configfs... Jan 16 17:56:56.694354 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 16 17:56:56.694365 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 16 17:56:56.694383 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 16 17:56:56.694398 systemd[1]: Starting modprobe@fuse.service - Load Kernel Module fuse... Jan 16 17:56:56.694409 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 16 17:56:56.694420 systemd[1]: setup-nsswitch.service - Create /etc/nsswitch.conf was skipped because of an unmet condition check (ConditionPathExists=!/etc/nsswitch.conf). Jan 16 17:56:56.694433 systemd[1]: systemd-fsck-root.service: Deactivated successfully. Jan 16 17:56:56.694445 systemd[1]: Stopped systemd-fsck-root.service - File System Check on Root Device. Jan 16 17:56:56.694456 systemd[1]: systemd-fsck-usr.service: Deactivated successfully. Jan 16 17:56:56.694467 systemd[1]: Stopped systemd-fsck-usr.service. Jan 16 17:56:56.694480 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 17:56:56.694492 systemd[1]: Starting systemd-journald.service - Journal Service... Jan 16 17:56:56.694503 systemd[1]: Starting systemd-modules-load.service - Load Kernel Modules... Jan 16 17:56:56.694515 systemd[1]: Starting systemd-network-generator.service - Generate network units from Kernel command line... Jan 16 17:56:56.694528 systemd[1]: Starting systemd-remount-fs.service - Remount Root and Kernel File Systems... Jan 16 17:56:56.694539 systemd[1]: Starting systemd-udev-load-credentials.service - Load udev Rules from Credentials... Jan 16 17:56:56.694561 systemd[1]: Starting systemd-udev-trigger.service - Coldplug All udev Devices... Jan 16 17:56:56.694573 systemd[1]: Mounted dev-hugepages.mount - Huge Pages File System. Jan 16 17:56:56.694583 systemd[1]: Mounted dev-mqueue.mount - POSIX Message Queue File System. Jan 16 17:56:56.694596 systemd[1]: Mounted media.mount - External Media Directory. Jan 16 17:56:56.694608 systemd[1]: Mounted sys-kernel-debug.mount - Kernel Debug File System. Jan 16 17:56:56.694619 systemd[1]: Mounted sys-kernel-tracing.mount - Kernel Trace File System. Jan 16 17:56:56.694630 systemd[1]: Mounted tmp.mount - Temporary Directory /tmp. Jan 16 17:56:56.694640 systemd[1]: Finished kmod-static-nodes.service - Create List of Static Device Nodes. Jan 16 17:56:56.694654 systemd[1]: modprobe@configfs.service: Deactivated successfully. Jan 16 17:56:56.694665 systemd[1]: Finished modprobe@configfs.service - Load Kernel Module configfs. 
Jan 16 17:56:56.694675 kernel: fuse: init (API version 7.41) Jan 16 17:56:56.694685 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 16 17:56:56.694696 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 16 17:56:56.694706 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 16 17:56:56.694717 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. Jan 16 17:56:56.694729 systemd[1]: modprobe@fuse.service: Deactivated successfully. Jan 16 17:56:56.694740 systemd[1]: Finished modprobe@fuse.service - Load Kernel Module fuse. Jan 16 17:56:56.694750 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 16 17:56:56.694761 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 16 17:56:56.694772 systemd[1]: Finished flatcar-tmpfiles.service - Create missing system files. Jan 16 17:56:56.694783 systemd[1]: Finished systemd-modules-load.service - Load Kernel Modules. Jan 16 17:56:56.694794 systemd[1]: Finished systemd-network-generator.service - Generate network units from Kernel command line. Jan 16 17:56:56.694806 systemd[1]: Finished systemd-remount-fs.service - Remount Root and Kernel File Systems. Jan 16 17:56:56.694817 kernel: ACPI: bus type drm_connector registered Jan 16 17:56:56.694828 systemd[1]: Reached target network-pre.target - Preparation for Network. Jan 16 17:56:56.694838 systemd[1]: Listening on systemd-importd.socket - Disk Image Download Service Socket. Jan 16 17:56:56.694850 systemd[1]: remount-root.service - Remount Root File System was skipped because of an unmet condition check (ConditionPathIsReadWrite=!/). Jan 16 17:56:56.694861 systemd[1]: Reached target local-fs.target - Local File Systems. Jan 16 17:56:56.697520 systemd[1]: Listening on systemd-sysext.socket - System Extension Image Management. Jan 16 17:56:56.697619 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 17:56:56.697638 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 16 17:56:56.697649 systemd[1]: Starting systemd-hwdb-update.service - Rebuild Hardware Database... Jan 16 17:56:56.697661 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 16 17:56:56.697707 systemd-journald[1193]: Collecting audit messages is enabled. Jan 16 17:56:56.697734 systemd[1]: Starting systemd-random-seed.service - Load/Save OS Random Seed... Jan 16 17:56:56.697749 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 16 17:56:56.697761 systemd-journald[1193]: Journal started Jan 16 17:56:56.697784 systemd-journald[1193]: Runtime Journal (/run/log/journal/2975e9ee83bd40498bdd8e8c9bfab007) is 8M, max 76.5M, 68.5M free. Jan 16 17:56:56.553000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-root comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.556000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-fsck-usr comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:56.558000 audit: BPF prog-id=14 op=UNLOAD Jan 16 17:56:56.558000 audit: BPF prog-id=13 op=UNLOAD Jan 16 17:56:56.564000 audit: BPF prog-id=15 op=LOAD Jan 16 17:56:56.564000 audit: BPF prog-id=16 op=LOAD Jan 16 17:56:56.564000 audit: BPF prog-id=17 op=LOAD Jan 16 17:56:56.625000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kmod-static-nodes comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.628000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.628000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@configfs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.634000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.634000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.643000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.643000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.648000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.648000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@fuse comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.652000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.652000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.654000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=flatcar-tmpfiles comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.660000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-modules-load comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:56.662000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-network-generator comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.664000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-remount-fs comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.690000 audit: CONFIG_CHANGE op=set audit_enabled=1 old=1 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 res=1 Jan 16 17:56:56.690000 audit[1193]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=60 a0=4 a1=ffffe6717c00 a2=4000 a3=0 items=0 ppid=1 pid=1193 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="systemd-journal" exe="/usr/lib/systemd/systemd-journald" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:56:56.690000 audit: PROCTITLE proctitle="/usr/lib/systemd/systemd-journald" Jan 16 17:56:56.370079 systemd[1]: Queued start job for default target multi-user.target. Jan 16 17:56:56.386971 systemd[1]: Unnecessary job was removed for dev-sda6.device - /dev/sda6. Jan 16 17:56:56.387717 systemd[1]: systemd-journald.service: Deactivated successfully. Jan 16 17:56:56.711578 systemd[1]: Starting systemd-sysctl.service - Apply Kernel Variables... Jan 16 17:56:56.719122 systemd[1]: Starting systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/... Jan 16 17:56:56.722570 systemd[1]: Starting systemd-sysusers.service - Create System Users... Jan 16 17:56:56.726571 systemd[1]: Started systemd-journald.service - Journal Service. Jan 16 17:56:56.725000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journald comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.727808 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 16 17:56:56.733000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.733000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.736000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-load-credentials comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.737000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-random-seed comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.732520 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 16 17:56:56.734749 systemd[1]: Finished systemd-udev-load-credentials.service - Load udev Rules from Credentials. Jan 16 17:56:56.737071 systemd[1]: Finished systemd-random-seed.service - Load/Save OS Random Seed. Jan 16 17:56:56.758431 systemd[1]: Reached target first-boot-complete.target - First Boot Complete. 
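The journald instance started above reports "Collecting audit messages is enabled", which is why the SERVICE_START/SERVICE_STOP audit records are interleaved with the regular journal output. A minimal sketch for pulling just those records back out later, assuming journalctl on the booted host:

  # Audit records forwarded into the journal carry _TRANSPORT=audit
  journalctl -b _TRANSPORT=audit
  # Narrow to the unit lifecycle events seen in this log
  journalctl -b _TRANSPORT=audit | grep -E 'SERVICE_(START|STOP)'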
Jan 16 17:56:56.760604 kernel: loop1: detected capacity change from 0 to 8 Jan 16 17:56:56.763798 systemd[1]: Starting systemd-journal-flush.service - Flush Journal to Persistent Storage... Jan 16 17:56:56.767812 systemd[1]: Starting systemd-machine-id-commit.service - Save Transient machine-id to Disk... Jan 16 17:56:56.777032 systemd[1]: Finished systemd-udev-trigger.service - Coldplug All udev Devices. Jan 16 17:56:56.778000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-udev-trigger comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.782336 systemd[1]: Finished systemd-sysctl.service - Apply Kernel Variables. Jan 16 17:56:56.784000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysctl comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.801591 kernel: loop2: detected capacity change from 0 to 100192 Jan 16 17:56:56.799000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysusers comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.800000 audit: BPF prog-id=18 op=LOAD Jan 16 17:56:56.801795 systemd-journald[1193]: Time spent on flushing to /var/log/journal/2975e9ee83bd40498bdd8e8c9bfab007 is 63.586ms for 1298 entries. Jan 16 17:56:56.801795 systemd-journald[1193]: System Journal (/var/log/journal/2975e9ee83bd40498bdd8e8c9bfab007) is 8M, max 588.1M, 580.1M free. Jan 16 17:56:56.889674 systemd-journald[1193]: Received client request to flush runtime journal. Jan 16 17:56:56.890051 kernel: loop3: detected capacity change from 0 to 45344 Jan 16 17:56:56.890092 kernel: loop4: detected capacity change from 0 to 200800 Jan 16 17:56:56.800000 audit: BPF prog-id=19 op=LOAD Jan 16 17:56:56.800000 audit: BPF prog-id=20 op=LOAD Jan 16 17:56:56.807000 audit: BPF prog-id=21 op=LOAD Jan 16 17:56:56.817000 audit: BPF prog-id=22 op=LOAD Jan 16 17:56:56.817000 audit: BPF prog-id=23 op=LOAD Jan 16 17:56:56.817000 audit: BPF prog-id=24 op=LOAD Jan 16 17:56:56.828000 audit: BPF prog-id=25 op=LOAD Jan 16 17:56:56.828000 audit: BPF prog-id=26 op=LOAD Jan 16 17:56:56.828000 audit: BPF prog-id=27 op=LOAD Jan 16 17:56:56.851000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-machine-id-commit comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.883000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-nsresourced comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.893000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-flush comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.798774 systemd[1]: Finished systemd-sysusers.service - Create System Users. Jan 16 17:56:56.806850 systemd[1]: Starting systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer... Jan 16 17:56:56.809969 systemd[1]: Starting systemd-resolved.service - Network Name Resolution... Jan 16 17:56:56.813588 systemd[1]: Starting systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev... 
Jan 16 17:56:56.819933 systemd[1]: Starting systemd-nsresourced.service - Namespace Resource Manager... Jan 16 17:56:56.830321 systemd[1]: Starting systemd-userdbd.service - User Database Manager... Jan 16 17:56:56.849624 systemd[1]: Finished systemd-machine-id-commit.service - Save Transient machine-id to Disk. Jan 16 17:56:56.878260 systemd-nsresourced[1254]: Not setting up BPF subsystem, as functionality has been disabled at compile time. Jan 16 17:56:56.880746 systemd[1]: Started systemd-nsresourced.service - Namespace Resource Manager. Jan 16 17:56:56.892517 systemd[1]: Finished systemd-journal-flush.service - Flush Journal to Persistent Storage. Jan 16 17:56:56.898800 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. Jan 16 17:56:56.898814 systemd-tmpfiles[1253]: ACLs are not supported, ignoring. Jan 16 17:56:56.912000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup-dev comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.911032 systemd[1]: Finished systemd-tmpfiles-setup-dev.service - Create Static Device Nodes in /dev. Jan 16 17:56:56.922626 kernel: loop5: detected capacity change from 0 to 8 Jan 16 17:56:56.926592 kernel: loop6: detected capacity change from 0 to 100192 Jan 16 17:56:56.937649 systemd[1]: Started systemd-userdbd.service - User Database Manager. Jan 16 17:56:56.939000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-userdbd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:56.949345 kernel: loop7: detected capacity change from 0 to 45344 Jan 16 17:56:56.961592 kernel: loop1: detected capacity change from 0 to 200800 Jan 16 17:56:56.984797 (sd-merge)[1271]: Using extensions 'containerd-flatcar.raw', 'docker-flatcar.raw', 'kubernetes.raw', 'oem-hetzner.raw'. Jan 16 17:56:56.992287 (sd-merge)[1271]: Merged extensions into '/usr'. Jan 16 17:56:57.003759 systemd[1]: Reload requested from client PID 1217 ('systemd-sysext') (unit systemd-sysext.service)... Jan 16 17:56:57.003778 systemd[1]: Reloading... Jan 16 17:56:57.013277 systemd-oomd[1250]: No swap; memory pressure usage will be degraded Jan 16 17:56:57.053453 systemd-resolved[1252]: Positive Trust Anchors: Jan 16 17:56:57.053849 systemd-resolved[1252]: . IN DS 20326 8 2 e06d44b80b8f1d39a95c0b0d7c65d08458e880409bbc683457104237c7f8ec8d Jan 16 17:56:57.053954 systemd-resolved[1252]: . IN DS 38696 8 2 683d2d0acb8c9b712a1948b27f741219298d0a450d612c483af444a4c0fb2b16 Jan 16 17:56:57.054028 systemd-resolved[1252]: Negative trust anchors: home.arpa 10.in-addr.arpa 16.172.in-addr.arpa 17.172.in-addr.arpa 18.172.in-addr.arpa 19.172.in-addr.arpa 20.172.in-addr.arpa 21.172.in-addr.arpa 22.172.in-addr.arpa 23.172.in-addr.arpa 24.172.in-addr.arpa 25.172.in-addr.arpa 26.172.in-addr.arpa 27.172.in-addr.arpa 28.172.in-addr.arpa 29.172.in-addr.arpa 30.172.in-addr.arpa 31.172.in-addr.arpa 170.0.0.192.in-addr.arpa 171.0.0.192.in-addr.arpa 168.192.in-addr.arpa d.f.ip6.arpa ipv4only.arpa resolver.arpa corp home internal intranet lan local private test Jan 16 17:56:57.065280 systemd-resolved[1252]: Using system hostname 'ci-4580-0-0-p-03fd9ab712'. Jan 16 17:56:57.098576 zram_generator::config[1311]: No configuration found. Jan 16 17:56:57.278217 systemd[1]: etc-machine\x2did.mount: Deactivated successfully. Jan 16 17:56:57.278385 systemd[1]: Reloading finished in 274 ms. 
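The (sd-merge) entries above show systemd-sysext overlaying the containerd-flatcar, docker-flatcar, kubernetes and oem-hetzner extension images onto /usr, which is what triggers the unit reload that follows. A minimal sketch of inspecting the same overlay by hand, assuming a shell on the booted host (standard systemd-sysext verbs, nothing Flatcar-specific):

  # Show which extension images are visible and whether they are merged
  systemd-sysext list
  systemd-sysext status
  # Re-evaluate the extension directories after adding or removing a *.raw image
  systemd-sysext refresh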
Jan 16 17:56:57.296193 systemd[1]: Started systemd-oomd.service - Userspace Out-Of-Memory (OOM) Killer. Jan 16 17:56:57.296000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-oomd comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.297195 systemd[1]: Started systemd-resolved.service - Network Name Resolution. Jan 16 17:56:57.297000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-resolved comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.298281 systemd[1]: Finished systemd-sysext.service - Merge System Extension Images into /usr/ and /opt/. Jan 16 17:56:57.298000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.304222 systemd[1]: Reached target nss-lookup.target - Host and Network Name Lookups. Jan 16 17:56:57.311813 systemd[1]: Starting ensure-sysext.service... Jan 16 17:56:57.315725 systemd[1]: Starting systemd-tmpfiles-setup.service - Create System Files and Directories... Jan 16 17:56:57.317000 audit: BPF prog-id=28 op=LOAD Jan 16 17:56:57.317000 audit: BPF prog-id=15 op=UNLOAD Jan 16 17:56:57.317000 audit: BPF prog-id=29 op=LOAD Jan 16 17:56:57.317000 audit: BPF prog-id=30 op=LOAD Jan 16 17:56:57.317000 audit: BPF prog-id=16 op=UNLOAD Jan 16 17:56:57.317000 audit: BPF prog-id=17 op=UNLOAD Jan 16 17:56:57.318000 audit: BPF prog-id=31 op=LOAD Jan 16 17:56:57.318000 audit: BPF prog-id=22 op=UNLOAD Jan 16 17:56:57.318000 audit: BPF prog-id=32 op=LOAD Jan 16 17:56:57.318000 audit: BPF prog-id=33 op=LOAD Jan 16 17:56:57.318000 audit: BPF prog-id=23 op=UNLOAD Jan 16 17:56:57.318000 audit: BPF prog-id=24 op=UNLOAD Jan 16 17:56:57.319000 audit: BPF prog-id=34 op=LOAD Jan 16 17:56:57.319000 audit: BPF prog-id=18 op=UNLOAD Jan 16 17:56:57.321000 audit: BPF prog-id=35 op=LOAD Jan 16 17:56:57.321000 audit: BPF prog-id=36 op=LOAD Jan 16 17:56:57.321000 audit: BPF prog-id=19 op=UNLOAD Jan 16 17:56:57.321000 audit: BPF prog-id=20 op=UNLOAD Jan 16 17:56:57.322000 audit: BPF prog-id=37 op=LOAD Jan 16 17:56:57.322000 audit: BPF prog-id=25 op=UNLOAD Jan 16 17:56:57.322000 audit: BPF prog-id=38 op=LOAD Jan 16 17:56:57.322000 audit: BPF prog-id=39 op=LOAD Jan 16 17:56:57.322000 audit: BPF prog-id=26 op=UNLOAD Jan 16 17:56:57.322000 audit: BPF prog-id=27 op=UNLOAD Jan 16 17:56:57.322000 audit: BPF prog-id=40 op=LOAD Jan 16 17:56:57.322000 audit: BPF prog-id=21 op=UNLOAD Jan 16 17:56:57.355989 systemd[1]: Reload requested from client PID 1344 ('systemctl') (unit ensure-sysext.service)... Jan 16 17:56:57.356006 systemd[1]: Reloading... Jan 16 17:56:57.373504 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/nfs-utils.conf:6: Duplicate line for path "/var/lib/nfs/sm", ignoring. Jan 16 17:56:57.374910 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/nfs-utils.conf:7: Duplicate line for path "/var/lib/nfs/sm.bak", ignoring. Jan 16 17:56:57.375174 systemd-tmpfiles[1345]: /usr/lib/tmpfiles.d/provision.conf:20: Duplicate line for path "/root", ignoring. Jan 16 17:56:57.376116 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. Jan 16 17:56:57.376171 systemd-tmpfiles[1345]: ACLs are not supported, ignoring. 
Jan 16 17:56:57.384826 systemd-tmpfiles[1345]: Detected autofs mount point /boot during canonicalization of boot. Jan 16 17:56:57.384840 systemd-tmpfiles[1345]: Skipping /boot Jan 16 17:56:57.400466 systemd-tmpfiles[1345]: Detected autofs mount point /boot during canonicalization of boot. Jan 16 17:56:57.400478 systemd-tmpfiles[1345]: Skipping /boot Jan 16 17:56:57.444577 zram_generator::config[1373]: No configuration found. Jan 16 17:56:57.608305 systemd[1]: Reloading finished in 252 ms. Jan 16 17:56:57.637405 systemd[1]: Finished systemd-hwdb-update.service - Rebuild Hardware Database. Jan 16 17:56:57.638000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-hwdb-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.641000 audit: BPF prog-id=41 op=LOAD Jan 16 17:56:57.641000 audit: BPF prog-id=34 op=UNLOAD Jan 16 17:56:57.641000 audit: BPF prog-id=42 op=LOAD Jan 16 17:56:57.641000 audit: BPF prog-id=43 op=LOAD Jan 16 17:56:57.641000 audit: BPF prog-id=35 op=UNLOAD Jan 16 17:56:57.641000 audit: BPF prog-id=36 op=UNLOAD Jan 16 17:56:57.642000 audit: BPF prog-id=44 op=LOAD Jan 16 17:56:57.642000 audit: BPF prog-id=28 op=UNLOAD Jan 16 17:56:57.642000 audit: BPF prog-id=45 op=LOAD Jan 16 17:56:57.642000 audit: BPF prog-id=46 op=LOAD Jan 16 17:56:57.642000 audit: BPF prog-id=29 op=UNLOAD Jan 16 17:56:57.642000 audit: BPF prog-id=30 op=UNLOAD Jan 16 17:56:57.644000 audit: BPF prog-id=47 op=LOAD Jan 16 17:56:57.648000 audit: BPF prog-id=31 op=UNLOAD Jan 16 17:56:57.648000 audit: BPF prog-id=48 op=LOAD Jan 16 17:56:57.648000 audit: BPF prog-id=49 op=LOAD Jan 16 17:56:57.648000 audit: BPF prog-id=32 op=UNLOAD Jan 16 17:56:57.648000 audit: BPF prog-id=33 op=UNLOAD Jan 16 17:56:57.649000 audit: BPF prog-id=50 op=LOAD Jan 16 17:56:57.649000 audit: BPF prog-id=37 op=UNLOAD Jan 16 17:56:57.649000 audit: BPF prog-id=51 op=LOAD Jan 16 17:56:57.649000 audit: BPF prog-id=52 op=LOAD Jan 16 17:56:57.649000 audit: BPF prog-id=38 op=UNLOAD Jan 16 17:56:57.649000 audit: BPF prog-id=39 op=UNLOAD Jan 16 17:56:57.651000 audit: BPF prog-id=53 op=LOAD Jan 16 17:56:57.651000 audit: BPF prog-id=40 op=UNLOAD Jan 16 17:56:57.655256 systemd[1]: Finished systemd-tmpfiles-setup.service - Create System Files and Directories. Jan 16 17:56:57.656000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-tmpfiles-setup comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.664267 systemd[1]: Mounting sys-fs-fuse-connections.mount - FUSE Control File System... Jan 16 17:56:57.666156 systemd[1]: Mounting sys-kernel-config.mount - Kernel Configuration File System... Jan 16 17:56:57.672314 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 16 17:56:57.678355 systemd[1]: Starting clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs... Jan 16 17:56:57.684805 systemd[1]: Starting ldconfig.service - Rebuild Dynamic Linker Cache... Jan 16 17:56:57.687004 systemd[1]: Starting systemd-journal-catalog-update.service - Rebuild Journal Catalog... 
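The systemd-tmpfiles warnings above ("Duplicate line for path ...", "Skipping /boot") are harmless but easy to chase down, since the merged tmpfiles.d configuration can be dumped with per-file headers. A minimal sketch, assuming the stock systemd-tmpfiles binary:

  # Print the merged tmpfiles.d configuration, annotated with "# /path/to/file" headers
  systemd-tmpfiles --cat-config
  # Locate the entries behind the duplicate-line warnings
  systemd-tmpfiles --cat-config | grep -n '/var/lib/nfs/sm'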
Jan 16 17:56:57.688000 audit: BPF prog-id=8 op=UNLOAD Jan 16 17:56:57.688000 audit: BPF prog-id=7 op=UNLOAD Jan 16 17:56:57.692000 audit: BPF prog-id=54 op=LOAD Jan 16 17:56:57.692000 audit: BPF prog-id=55 op=LOAD Jan 16 17:56:57.695005 systemd[1]: Starting systemd-udevd.service - Rule-based Manager for Device Events and Files... Jan 16 17:56:57.699271 systemd[1]: Starting systemd-update-utmp.service - Record System Boot/Shutdown in UTMP... Jan 16 17:56:57.705239 systemd[1]: Mounted sys-fs-fuse-connections.mount - FUSE Control File System. Jan 16 17:56:57.708968 systemd[1]: Mounted sys-kernel-config.mount - Kernel Configuration File System. Jan 16 17:56:57.716618 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 17:56:57.721085 systemd[1]: Starting modprobe@dm_mod.service - Load Kernel Module dm_mod... Jan 16 17:56:57.722869 systemd[1]: Starting modprobe@efi_pstore.service - Load Kernel Module efi_pstore... Jan 16 17:56:57.730508 systemd[1]: Starting modprobe@loop.service - Load Kernel Module loop... Jan 16 17:56:57.732740 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 17:56:57.732991 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 16 17:56:57.733117 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 17:56:57.736439 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 17:56:57.737796 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 17:56:57.737945 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 16 17:56:57.738040 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 17:56:57.740786 systemd[1]: ignition-delete-config.service - Ignition (delete config) was skipped because no trigger condition checks were met. Jan 16 17:56:57.751523 systemd[1]: Starting modprobe@drm.service - Load Kernel Module drm... Jan 16 17:56:57.753002 systemd[1]: systemd-binfmt.service - Set Up Additional Binary Formats was skipped because no trigger condition checks were met. Jan 16 17:56:57.753195 systemd[1]: systemd-confext.service - Merge System Configuration Images into /etc/ was skipped because no trigger condition checks were met. Jan 16 17:56:57.753283 systemd[1]: systemd-hibernate-clear.service - Clear Stale Hibernate Storage Info was skipped because of an unmet condition check (ConditionPathExists=/sys/firmware/efi/efivars/HibernateLocation-8cf2644b-4b0b-428f-9387-6d876050dc67). Jan 16 17:56:57.755000 audit[1427]: SYSTEM_BOOT pid=1427 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg=' comm="systemd-update-utmp" exe="/usr/lib/systemd/systemd-update-utmp" hostname=? addr=? terminal=? 
res=success' Jan 16 17:56:57.760000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=ensure-sysext comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.759138 systemd[1]: Finished ensure-sysext.service. Jan 16 17:56:57.768000 audit: BPF prog-id=56 op=LOAD Jan 16 17:56:57.770651 systemd[1]: Starting systemd-timesyncd.service - Network Time Synchronization... Jan 16 17:56:57.781623 systemd[1]: Finished systemd-update-utmp.service - Record System Boot/Shutdown in UTMP. Jan 16 17:56:57.782000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-update-utmp comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.812472 systemd-udevd[1423]: Using default interface naming scheme 'v257'. Jan 16 17:56:57.813215 systemd[1]: Finished systemd-journal-catalog-update.service - Rebuild Journal Catalog. Jan 16 17:56:57.814000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=systemd-journal-catalog-update comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.815209 systemd[1]: modprobe@dm_mod.service: Deactivated successfully. Jan 16 17:56:57.816139 systemd[1]: Finished modprobe@dm_mod.service - Load Kernel Module dm_mod. Jan 16 17:56:57.817000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.817000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@dm_mod comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.823000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.823000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@loop comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.825000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=clean-ca-certificates comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.821120 systemd[1]: modprobe@loop.service: Deactivated successfully. Jan 16 17:56:57.821414 systemd[1]: Finished modprobe@loop.service - Load Kernel Module loop. Jan 16 17:56:57.825624 systemd[1]: Finished clean-ca-certificates.service - Clean up broken links in /etc/ssl/certs. Jan 16 17:56:57.827623 systemd[1]: systemd-repart.service - Repartition Root Disk was skipped because no trigger condition checks were met. Jan 16 17:56:57.827661 systemd[1]: update-ca-certificates.service - Update CA bundle at /etc/ssl/certs/ca-certificates.crt was skipped because of an unmet condition check (ConditionPathIsSymbolicLink=!/etc/ssl/certs/ca-certificates.crt). Jan 16 17:56:57.829983 systemd[1]: modprobe@efi_pstore.service: Deactivated successfully. Jan 16 17:56:57.830407 systemd[1]: Finished modprobe@efi_pstore.service - Load Kernel Module efi_pstore. 
Jan 16 17:56:57.831000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.831000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@efi_pstore comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.832969 systemd[1]: modprobe@drm.service: Deactivated successfully. Jan 16 17:56:57.834731 systemd[1]: Finished modprobe@drm.service - Load Kernel Module drm. Jan 16 17:56:57.837000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.837000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=modprobe@drm comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:56:57.842771 systemd[1]: systemd-pstore.service - Platform Persistent Storage Archival was skipped because of an unmet condition check (ConditionDirectoryNotEmpty=/sys/fs/pstore). Jan 16 17:56:57.850000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=add_rule key=(null) list=5 res=1 Jan 16 17:56:57.850000 audit[1460]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=fffff7a3c290 a2=420 a3=0 items=0 ppid=1418 pid=1460 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:56:57.850000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 17:56:57.851968 augenrules[1460]: No rules Jan 16 17:56:57.853243 systemd[1]: audit-rules.service: Deactivated successfully. Jan 16 17:56:57.854795 systemd[1]: Finished audit-rules.service - Load Audit Rules. Jan 16 17:56:57.887106 systemd[1]: Started systemd-timesyncd.service - Network Time Synchronization. Jan 16 17:56:57.888267 systemd[1]: Reached target time-set.target - System Time Set. Jan 16 17:56:57.898199 systemd[1]: Started systemd-udevd.service - Rule-based Manager for Device Events and Files. Jan 16 17:56:57.903351 systemd[1]: Starting systemd-networkd.service - Network Configuration... Jan 16 17:56:58.009484 systemd-networkd[1470]: lo: Link UP Jan 16 17:56:58.009495 systemd-networkd[1470]: lo: Gained carrier Jan 16 17:56:58.011041 systemd[1]: Started systemd-networkd.service - Network Configuration. Jan 16 17:56:58.013265 systemd[1]: Reached target network.target - Network. Jan 16 17:56:58.017157 systemd[1]: Starting systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd... Jan 16 17:56:58.021411 systemd[1]: Starting systemd-networkd-wait-online.service - Wait for Network to be Configured... Jan 16 17:56:58.022703 systemd[1]: Condition check resulted in dev-ttyAMA0.device - /dev/ttyAMA0 being skipped. Jan 16 17:56:58.059309 systemd[1]: Finished systemd-networkd-persistent-storage.service - Enable Persistent Storage in systemd-networkd. 
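The audit SYSCALL/PROCTITLE pair above records auditctl loading the (empty) rule set; PROCTITLE is hex-encoded argv with NUL separators, so it can be decoded straight from the log. A small sketch, assuming xxd is available:

  # Decode the PROCTITLE field from the audit record above
  echo 2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 \
    | xxd -r -p | tr '\0' ' '; echo
  # -> /sbin/auditctl -R /etc/audit/audit.rules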
Jan 16 17:56:58.076207 systemd-networkd[1470]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:58.076219 systemd-networkd[1470]: eth1: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 16 17:56:58.076937 systemd-networkd[1470]: eth1: Link UP Jan 16 17:56:58.077636 systemd-networkd[1470]: eth1: Gained carrier Jan 16 17:56:58.077657 systemd-networkd[1470]: eth1: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:58.125632 systemd-networkd[1470]: eth1: DHCPv4 address 10.0.0.3/32 acquired from 10.0.0.1 Jan 16 17:56:58.126307 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:56:58.129278 systemd-networkd[1470]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:58.129290 systemd-networkd[1470]: eth0: Configuring with /usr/lib/systemd/network/zz-default.network. Jan 16 17:56:58.130236 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:56:58.130427 systemd-networkd[1470]: eth0: Link UP Jan 16 17:56:58.130625 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:56:58.131290 systemd-networkd[1470]: eth0: Gained carrier Jan 16 17:56:58.131311 systemd-networkd[1470]: eth0: Found matching .network file, based on potentially unpredictable interface name: /usr/lib/systemd/network/zz-default.network Jan 16 17:56:58.136353 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:56:58.163746 ldconfig[1420]: /sbin/ldconfig: /usr/lib/ld.so.conf is not an ELF file - it has the wrong magic bytes at the start. Jan 16 17:56:58.164571 kernel: mousedev: PS/2 mouse device common for all mice Jan 16 17:56:58.172062 systemd[1]: Finished ldconfig.service - Rebuild Dynamic Linker Cache. Jan 16 17:56:58.177211 systemd[1]: Starting systemd-update-done.service - Update is Completed... Jan 16 17:56:58.181669 systemd-networkd[1470]: eth0: DHCPv4 address 49.12.189.56/32, gateway 172.31.1.1 acquired from 172.31.1.1 Jan 16 17:56:58.182656 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:56:58.208228 systemd[1]: Finished systemd-update-done.service - Update is Completed. Jan 16 17:56:58.222020 systemd[1]: Reached target sysinit.target - System Initialization. Jan 16 17:56:58.223166 systemd[1]: Started motdgen.path - Watch for update engine configuration changes. Jan 16 17:56:58.224383 systemd[1]: Started user-cloudinit@var-lib-flatcar\x2dinstall-user_data.path - Watch for a cloud-config at /var/lib/flatcar-install/user_data. Jan 16 17:56:58.225748 systemd[1]: Started logrotate.timer - Daily rotation of log files. Jan 16 17:56:58.226914 systemd[1]: Started mdadm.timer - Weekly check for MD array's redundancy information.. Jan 16 17:56:58.228059 systemd[1]: Started systemd-sysupdate-reboot.timer - Reboot Automatically After System Update. Jan 16 17:56:58.229272 systemd[1]: Started systemd-sysupdate.timer - Automatic System Update. Jan 16 17:56:58.230451 systemd[1]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of Temporary Directories. Jan 16 17:56:58.231749 systemd[1]: update-engine-stub.timer - Update Engine Stub Timer was skipped because of an unmet condition check (ConditionPathExists=/usr/.noupdate). 
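Both eth0 and eth1 above are matched by the catch-all /usr/lib/systemd/network/zz-default.network and configured via DHCPv4. A minimal sketch of verifying the result on the running host, assuming networkctl from the same systemd build:

  # Per-link state as seen by systemd-networkd
  networkctl list
  networkctl status eth0
  # The catch-all unit the log refers to can be read directly
  cat /usr/lib/systemd/network/zz-default.network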
Jan 16 17:56:58.231782 systemd[1]: Reached target paths.target - Path Units. Jan 16 17:56:58.232818 systemd[1]: Reached target timers.target - Timer Units. Jan 16 17:56:58.234429 systemd[1]: Listening on dbus.socket - D-Bus System Message Bus Socket. Jan 16 17:56:58.237455 systemd[1]: Starting docker.socket - Docker Socket for the API... Jan 16 17:56:58.243432 systemd[1]: Listening on sshd-unix-local.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_UNIX Local). Jan 16 17:56:58.245087 systemd[1]: Listening on sshd-vsock.socket - OpenSSH Server Socket (systemd-ssh-generator, AF_VSOCK). Jan 16 17:56:58.246614 systemd[1]: Reached target ssh-access.target - SSH Access Available. Jan 16 17:56:58.251628 systemd[1]: Listening on sshd.socket - OpenSSH Server Socket. Jan 16 17:56:58.254372 systemd[1]: Listening on systemd-hostnamed.socket - Hostname Service Socket. Jan 16 17:56:58.256692 systemd[1]: Listening on docker.socket - Docker Socket for the API. Jan 16 17:56:58.263290 systemd[1]: Found device dev-disk-by\x2dlabel-OEM.device - QEMU_HARDDISK OEM. Jan 16 17:56:58.267710 systemd[1]: Reached target sockets.target - Socket Units. Jan 16 17:56:58.269631 systemd[1]: Reached target basic.target - Basic System. Jan 16 17:56:58.270639 systemd[1]: addon-config@oem.service - Configure Addon /oem was skipped because no trigger condition checks were met. Jan 16 17:56:58.270669 systemd[1]: addon-run@oem.service - Run Addon /oem was skipped because no trigger condition checks were met. Jan 16 17:56:58.272989 kernel: [drm] pci: virtio-gpu-pci detected at 0000:00:01.0 Jan 16 17:56:58.273052 kernel: [drm] features: -virgl +edid -resource_blob -host_visible Jan 16 17:56:58.273068 kernel: [drm] features: -context_init Jan 16 17:56:58.273433 systemd[1]: Starting containerd.service - containerd container runtime... Jan 16 17:56:58.275973 kernel: [drm] number of scanouts: 1 Jan 16 17:56:58.276032 kernel: [drm] number of cap sets: 0 Jan 16 17:56:58.277655 kernel: [drm] Initialized virtio_gpu 0.1.0 for 0000:00:01.0 on minor 0 Jan 16 17:56:58.279719 systemd[1]: Starting coreos-metadata.service - Flatcar Metadata Agent... Jan 16 17:56:58.287591 kernel: Console: switching to colour frame buffer device 160x50 Jan 16 17:56:58.293991 systemd[1]: Starting dbus.service - D-Bus System Message Bus... Jan 16 17:56:58.294674 kernel: virtio-pci 0000:00:01.0: [drm] fb0: virtio_gpudrmfb frame buffer device Jan 16 17:56:58.299805 systemd[1]: Starting dracut-shutdown.service - Restore /run/initramfs on shutdown... Jan 16 17:56:58.305002 systemd[1]: Starting enable-oem-cloudinit.service - Enable cloudinit... Jan 16 17:56:58.311686 systemd[1]: Starting extend-filesystems.service - Extend Filesystems... Jan 16 17:56:58.313289 systemd[1]: flatcar-setup-environment.service - Modifies /etc/environment for CoreOS was skipped because of an unmet condition check (ConditionPathExists=/oem/bin/flatcar-setup-environment). Jan 16 17:56:58.317759 systemd[1]: Starting motdgen.service - Generate /run/flatcar/motd... Jan 16 17:56:58.320519 jq[1523]: false Jan 16 17:56:58.320926 systemd[1]: Starting prepare-helm.service - Unpack helm to /opt/bin... Jan 16 17:56:58.323519 systemd[1]: Starting ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline... Jan 16 17:56:58.328786 systemd[1]: Starting sshd-keygen.service - Generate sshd host keys... Jan 16 17:56:58.331295 systemd[1]: Starting systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM... 
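Unit names like dev-disk-by\x2dlabel-OEM.device above look odd only because systemd escapes "-" inside path components as \x2d when deriving unit names from paths. A small sketch of reproducing that escaping, assuming systemd-escape is present:

  # Escape a device path the way systemd does for the .device unit above
  systemd-escape --path /dev/disk/by-label/OEM
  # -> dev-disk-by\x2dlabel-OEM
  systemd-escape --path --suffix=device /dev/disk/by-label/OEM
  # -> dev-disk-by\x2dlabel-OEM.device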
Jan 16 17:56:58.339144 systemd[1]: Starting systemd-logind.service - User Login Management... Jan 16 17:56:58.340071 systemd[1]: tcsd.service - TCG Core Services Daemon was skipped because of an unmet condition check (ConditionPathExists=/dev/tpm0). Jan 16 17:56:58.340489 systemd[1]: cgroup compatibility translation between legacy and unified hierarchy settings activated. See cgroup-compat debug messages for details. Jan 16 17:56:58.343255 systemd[1]: Starting update-engine.service - Update Engine... Jan 16 17:56:58.348188 systemd[1]: Starting update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition... Jan 16 17:56:58.353519 systemd[1]: Finished dracut-shutdown.service - Restore /run/initramfs on shutdown. Jan 16 17:56:58.356575 systemd[1]: enable-oem-cloudinit.service: Skipped due to 'exec-condition'. Jan 16 17:56:58.357707 systemd[1]: Condition check resulted in enable-oem-cloudinit.service - Enable cloudinit being skipped. Jan 16 17:56:58.365138 systemd[1]: ssh-key-proc-cmdline.service: Deactivated successfully. Jan 16 17:56:58.366661 systemd[1]: Finished ssh-key-proc-cmdline.service - Install an ssh key from /proc/cmdline. Jan 16 17:56:58.384155 systemd[1]: Finished systemd-fsck@dev-disk-by\x2dlabel-OEM.service - File System Check on /dev/disk/by-label/OEM. Jan 16 17:56:58.390176 jq[1534]: true Jan 16 17:56:58.407618 jq[1551]: true Jan 16 17:56:58.410043 coreos-metadata[1520]: Jan 16 17:56:58.408 INFO Fetching http://169.254.169.254/hetzner/v1/metadata: Attempt #1 Jan 16 17:56:58.410757 coreos-metadata[1520]: Jan 16 17:56:58.410 INFO Fetch successful Jan 16 17:56:58.411518 coreos-metadata[1520]: Jan 16 17:56:58.411 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/private-networks: Attempt #1 Jan 16 17:56:58.412571 extend-filesystems[1524]: Found /dev/sda6 Jan 16 17:56:58.416268 coreos-metadata[1520]: Jan 16 17:56:58.415 INFO Fetch successful Jan 16 17:56:58.429716 extend-filesystems[1524]: Found /dev/sda9 Jan 16 17:56:58.439571 extend-filesystems[1524]: Checking size of /dev/sda9 Jan 16 17:56:58.453356 tar[1536]: linux-arm64/LICENSE Jan 16 17:56:58.453356 tar[1536]: linux-arm64/helm Jan 16 17:56:58.456202 systemd[1]: motdgen.service: Deactivated successfully. Jan 16 17:56:58.456485 systemd[1]: Finished motdgen.service - Generate /run/flatcar/motd. Jan 16 17:56:58.464480 extend-filesystems[1524]: Resized partition /dev/sda9 Jan 16 17:56:58.471062 extend-filesystems[1579]: resize2fs 1.47.3 (8-Jul-2025) Jan 16 17:56:58.472539 dbus-daemon[1521]: [system] SELinux support is enabled Jan 16 17:56:58.473162 systemd[1]: Started dbus.service - D-Bus System Message Bus. Jan 16 17:56:58.479714 systemd[1]: system-cloudinit@usr-share-oem-cloud\x2dconfig.yml.service - Load cloud-config from /usr/share/oem/cloud-config.yml was skipped because of an unmet condition check (ConditionFileNotEmpty=/usr/share/oem/cloud-config.yml). Jan 16 17:56:58.484736 update_engine[1533]: I20260116 17:56:58.481644 1533 main.cc:92] Flatcar Update Engine starting Jan 16 17:56:58.479747 systemd[1]: Reached target system-config.target - Load system-provided cloud configs. Jan 16 17:56:58.480533 systemd[1]: user-cloudinit-proc-cmdline.service - Load cloud-config from url defined in /proc/cmdline was skipped because of an unmet condition check (ConditionKernelCommandLine=cloud-config-url). Jan 16 17:56:58.480571 systemd[1]: Reached target user-config.target - Load user-provided cloud configs. 
Jan 16 17:56:58.486598 kernel: EXT4-fs (sda9): resizing filesystem from 1617920 to 8410107 blocks Jan 16 17:56:58.496787 systemd[1]: Condition check resulted in dev-virtio\x2dports-org.qemu.guest_agent.0.device - /dev/virtio-ports/org.qemu.guest_agent.0 being skipped. Jan 16 17:56:58.503256 systemd[1]: Started qemu-guest-agent.service - QEMU Guest Agent. Jan 16 17:56:58.511959 update_engine[1533]: I20260116 17:56:58.511904 1533 update_check_scheduler.cc:74] Next update check in 11m56s Jan 16 17:56:58.514932 systemd[1]: Started update-engine.service - Update Engine. Jan 16 17:56:58.531246 systemd[1]: Started locksmithd.service - Cluster reboot manager. Jan 16 17:56:58.577313 systemd-logind[1532]: New seat seat0. Jan 16 17:56:58.586478 systemd[1]: Started systemd-logind.service - User Login Management. Jan 16 17:56:58.602520 bash[1597]: Updated "/home/core/.ssh/authorized_keys" Jan 16 17:56:58.606654 systemd[1]: Finished update-ssh-keys-after-ignition.service - Run update-ssh-keys once after Ignition. Jan 16 17:56:58.616891 systemd[1]: Starting sshkeys.service... Jan 16 17:56:58.625472 systemd[1]: Finished coreos-metadata.service - Flatcar Metadata Agent. Jan 16 17:56:58.626902 systemd[1]: packet-phone-home.service - Report Success to Packet was skipped because no trigger condition checks were met. Jan 16 17:56:58.654588 kernel: EXT4-fs (sda9): resized filesystem to 8410107 Jan 16 17:56:58.668480 extend-filesystems[1579]: Filesystem at /dev/sda9 is mounted on /; on-line resizing required Jan 16 17:56:58.668480 extend-filesystems[1579]: old_desc_blocks = 1, new_desc_blocks = 5 Jan 16 17:56:58.668480 extend-filesystems[1579]: The filesystem on /dev/sda9 is now 8410107 (4k) blocks long. Jan 16 17:56:58.672842 extend-filesystems[1524]: Resized filesystem in /dev/sda9 Jan 16 17:56:58.670726 systemd[1]: extend-filesystems.service: Deactivated successfully. Jan 16 17:56:58.673609 systemd[1]: Finished extend-filesystems.service - Extend Filesystems. Jan 16 17:56:58.683728 systemd[1]: Created slice system-coreos\x2dmetadata\x2dsshkeys.slice - Slice /system/coreos-metadata-sshkeys. Jan 16 17:56:58.690635 systemd[1]: Starting coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys)... Jan 16 17:56:58.825096 systemd-logind[1532]: Watching system buttons on /dev/input/event0 (Power Button) Jan 16 17:56:58.868670 containerd[1547]: time="2026-01-16T17:56:58Z" level=warning msg="Ignoring unknown key in TOML" column=1 error="strict mode: fields in the document are missing in the target struct" file=/usr/share/containerd/config.toml key=subreaper row=8 Jan 16 17:56:58.869759 containerd[1547]: time="2026-01-16T17:56:58.869724400Z" level=info msg="starting containerd" revision=fcd43222d6b07379a4be9786bda52438f0dd16a1 version=v2.1.5 Jan 16 17:56:58.877071 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... 
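The extend-filesystems entries above show resize2fs growing the mounted root filesystem on /dev/sda9 from 1617920 to 8410107 blocks while it stays online. A minimal sketch of the equivalent manual step, assuming the underlying partition has already been enlarged (as the first-boot tooling did here):

  # Online grow of a mounted ext4 filesystem to fill its partition
  resize2fs /dev/sda9
  # Confirm the new size
  df -h /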
Jan 16 17:56:58.897662 coreos-metadata[1626]: Jan 16 17:56:58.895 INFO Fetching http://169.254.169.254/hetzner/v1/metadata/public-keys: Attempt #1 Jan 16 17:56:58.898520 coreos-metadata[1626]: Jan 16 17:56:58.898 INFO Fetch successful Jan 16 17:56:58.900470 containerd[1547]: time="2026-01-16T17:56:58.900409200Z" level=warning msg="Configuration migrated from version 2, use `containerd config migrate` to avoid migration" t="10.92µs" Jan 16 17:56:58.900470 containerd[1547]: time="2026-01-16T17:56:58.900460720Z" level=info msg="loading plugin" id=io.containerd.content.v1.content type=io.containerd.content.v1 Jan 16 17:56:58.900536 containerd[1547]: time="2026-01-16T17:56:58.900513120Z" level=info msg="loading plugin" id=io.containerd.image-verifier.v1.bindir type=io.containerd.image-verifier.v1 Jan 16 17:56:58.900536 containerd[1547]: time="2026-01-16T17:56:58.900525920Z" level=info msg="loading plugin" id=io.containerd.internal.v1.opt type=io.containerd.internal.v1 Jan 16 17:56:58.900729 containerd[1547]: time="2026-01-16T17:56:58.900703840Z" level=info msg="loading plugin" id=io.containerd.warning.v1.deprecations type=io.containerd.warning.v1 Jan 16 17:56:58.900757 containerd[1547]: time="2026-01-16T17:56:58.900734600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 16 17:56:58.901281 containerd[1547]: time="2026-01-16T17:56:58.901243800Z" level=info msg="skip loading plugin" error="no scratch file generator: skip plugin" id=io.containerd.snapshotter.v1.blockfile type=io.containerd.snapshotter.v1 Jan 16 17:56:58.901281 containerd[1547]: time="2026-01-16T17:56:58.901270120Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.902064 unknown[1626]: wrote ssh authorized keys file for user: core Jan 16 17:56:58.902277 containerd[1547]: time="2026-01-16T17:56:58.902238560Z" level=info msg="skip loading plugin" error="path /var/lib/containerd/io.containerd.snapshotter.v1.btrfs (ext4) must be a btrfs filesystem to be used with the btrfs snapshotter: skip plugin" id=io.containerd.snapshotter.v1.btrfs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.902569 containerd[1547]: time="2026-01-16T17:56:58.902260600Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 16 17:56:58.902569 containerd[1547]: time="2026-01-16T17:56:58.902333280Z" level=info msg="skip loading plugin" error="devmapper not configured: skip plugin" id=io.containerd.snapshotter.v1.devmapper type=io.containerd.snapshotter.v1 Jan 16 17:56:58.902569 containerd[1547]: time="2026-01-16T17:56:58.902344440Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.902806 containerd[1547]: time="2026-01-16T17:56:58.902649240Z" level=info msg="skip loading plugin" error="EROFS unsupported, please `modprobe erofs`: skip plugin" id=io.containerd.snapshotter.v1.erofs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.902686200Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.native type=io.containerd.snapshotter.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.903840640Z" level=info msg="loading plugin" id=io.containerd.snapshotter.v1.overlayfs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.904066400Z" level=info msg="loading plugin" 
id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.904095600Z" level=info msg="skip loading plugin" error="lstat /var/lib/containerd/io.containerd.snapshotter.v1.zfs: no such file or directory: skip plugin" id=io.containerd.snapshotter.v1.zfs type=io.containerd.snapshotter.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.904106680Z" level=info msg="loading plugin" id=io.containerd.event.v1.exchange type=io.containerd.event.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.904148280Z" level=info msg="loading plugin" id=io.containerd.monitor.task.v1.cgroups type=io.containerd.monitor.task.v1 Jan 16 17:56:58.904483 containerd[1547]: time="2026-01-16T17:56:58.904397080Z" level=info msg="loading plugin" id=io.containerd.metadata.v1.bolt type=io.containerd.metadata.v1 Jan 16 17:56:58.905492 containerd[1547]: time="2026-01-16T17:56:58.904918760Z" level=info msg="metadata content store policy set" policy=shared Jan 16 17:56:58.916912 containerd[1547]: time="2026-01-16T17:56:58.916837440Z" level=info msg="loading plugin" id=io.containerd.gc.v1.scheduler type=io.containerd.gc.v1 Jan 16 17:56:58.916912 containerd[1547]: time="2026-01-16T17:56:58.916931480Z" level=info msg="loading plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 16 17:56:58.917071 containerd[1547]: time="2026-01-16T17:56:58.917018320Z" level=info msg="skip loading plugin" error="could not find mkfs.erofs: exec: \"mkfs.erofs\": executable file not found in $PATH: skip plugin" id=io.containerd.differ.v1.erofs type=io.containerd.differ.v1 Jan 16 17:56:58.917071 containerd[1547]: time="2026-01-16T17:56:58.917043280Z" level=info msg="loading plugin" id=io.containerd.differ.v1.walking type=io.containerd.differ.v1 Jan 16 17:56:58.917071 containerd[1547]: time="2026-01-16T17:56:58.917057240Z" level=info msg="loading plugin" id=io.containerd.lease.v1.manager type=io.containerd.lease.v1 Jan 16 17:56:58.917071 containerd[1547]: time="2026-01-16T17:56:58.917071240Z" level=info msg="loading plugin" id=io.containerd.service.v1.containers-service type=io.containerd.service.v1 Jan 16 17:56:58.917162 containerd[1547]: time="2026-01-16T17:56:58.917083000Z" level=info msg="loading plugin" id=io.containerd.service.v1.content-service type=io.containerd.service.v1 Jan 16 17:56:58.917162 containerd[1547]: time="2026-01-16T17:56:58.917092440Z" level=info msg="loading plugin" id=io.containerd.service.v1.diff-service type=io.containerd.service.v1 Jan 16 17:56:58.917162 containerd[1547]: time="2026-01-16T17:56:58.917103840Z" level=info msg="loading plugin" id=io.containerd.service.v1.images-service type=io.containerd.service.v1 Jan 16 17:56:58.917162 containerd[1547]: time="2026-01-16T17:56:58.917115920Z" level=info msg="loading plugin" id=io.containerd.service.v1.introspection-service type=io.containerd.service.v1 Jan 16 17:56:58.917162 containerd[1547]: time="2026-01-16T17:56:58.917132160Z" level=info msg="loading plugin" id=io.containerd.service.v1.namespaces-service type=io.containerd.service.v1 Jan 16 17:56:58.917162 containerd[1547]: time="2026-01-16T17:56:58.917145400Z" level=info msg="loading plugin" id=io.containerd.service.v1.snapshots-service type=io.containerd.service.v1 Jan 16 17:56:58.917251 containerd[1547]: time="2026-01-16T17:56:58.917162480Z" level=info msg="loading plugin" id=io.containerd.shim.v1.manager type=io.containerd.shim.v1 Jan 16 17:56:58.917251 containerd[1547]: 
time="2026-01-16T17:56:58.917180520Z" level=info msg="loading plugin" id=io.containerd.runtime.v2.task type=io.containerd.runtime.v2 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917299120Z" level=info msg="loading plugin" id=io.containerd.service.v1.tasks-service type=io.containerd.service.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917327960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.containers type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917350160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.content type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917361400Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.diff type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917371080Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.events type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917380680Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.images type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917391640Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.introspection type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917403160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.leases type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917416840Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.namespaces type=io.containerd.grpc.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917427800Z" level=info msg="loading plugin" id=io.containerd.sandbox.store.v1.local type=io.containerd.sandbox.store.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917439560Z" level=info msg="loading plugin" id=io.containerd.transfer.v1.local type=io.containerd.transfer.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917465600Z" level=info msg="loading plugin" id=io.containerd.cri.v1.images type=io.containerd.cri.v1 Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917503360Z" level=info msg="Get image filesystem path \"/var/lib/containerd/io.containerd.snapshotter.v1.overlayfs\" for snapshotter \"overlayfs\"" Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917517600Z" level=info msg="Start snapshots syncer" Jan 16 17:56:58.917574 containerd[1547]: time="2026-01-16T17:56:58.917560080Z" level=info msg="loading plugin" id=io.containerd.cri.v1.runtime type=io.containerd.cri.v1 Jan 16 17:56:58.918853 containerd[1547]: time="2026-01-16T17:56:58.917796000Z" level=info msg="starting cri plugin" 
config="{\"containerd\":{\"defaultRuntimeName\":\"runc\",\"runtimes\":{\"runc\":{\"runtimeType\":\"io.containerd.runc.v2\",\"runtimePath\":\"\",\"PodAnnotations\":null,\"ContainerAnnotations\":null,\"options\":{\"BinaryName\":\"\",\"CriuImagePath\":\"\",\"CriuWorkPath\":\"\",\"IoGid\":0,\"IoUid\":0,\"NoNewKeyring\":false,\"Root\":\"\",\"ShimCgroup\":\"\",\"SystemdCgroup\":true},\"privileged_without_host_devices\":false,\"privileged_without_host_devices_all_devices_allowed\":false,\"cgroupWritable\":false,\"baseRuntimeSpec\":\"\",\"cniConfDir\":\"\",\"cniMaxConfNum\":0,\"snapshotter\":\"\",\"sandboxer\":\"podsandbox\",\"io_type\":\"\"}},\"ignoreBlockIONotEnabledErrors\":false,\"ignoreRdtNotEnabledErrors\":false},\"cni\":{\"binDir\":\"\",\"binDirs\":[\"/opt/cni/bin\"],\"confDir\":\"/etc/cni/net.d\",\"maxConfNum\":1,\"setupSerially\":false,\"confTemplate\":\"\",\"ipPref\":\"\",\"useInternalLoopback\":false},\"enableSelinux\":true,\"selinuxCategoryRange\":1024,\"maxContainerLogLineSize\":16384,\"disableApparmor\":false,\"restrictOOMScoreAdj\":false,\"disableProcMount\":false,\"unsetSeccompProfile\":\"\",\"tolerateMissingHugetlbController\":true,\"disableHugetlbController\":true,\"device_ownership_from_security_context\":false,\"ignoreImageDefinedVolumes\":false,\"netnsMountsUnderStateDir\":false,\"enableUnprivilegedPorts\":true,\"enableUnprivilegedICMP\":true,\"enableCDI\":true,\"cdiSpecDirs\":[\"/etc/cdi\",\"/var/run/cdi\"],\"drainExecSyncIOTimeout\":\"0s\",\"ignoreDeprecationWarnings\":null,\"containerdRootDir\":\"/var/lib/containerd\",\"containerdEndpoint\":\"/run/containerd/containerd.sock\",\"rootDir\":\"/var/lib/containerd/io.containerd.grpc.v1.cri\",\"stateDir\":\"/run/containerd/io.containerd.grpc.v1.cri\"}" Jan 16 17:56:58.918853 containerd[1547]: time="2026-01-16T17:56:58.917844000Z" level=info msg="loading plugin" id=io.containerd.podsandbox.controller.v1.podsandbox type=io.containerd.podsandbox.controller.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.917948480Z" level=info msg="loading plugin" id=io.containerd.sandbox.controller.v1.shim type=io.containerd.sandbox.controller.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918049960Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandbox-controllers type=io.containerd.grpc.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918072160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.sandboxes type=io.containerd.grpc.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918082600Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.snapshots type=io.containerd.grpc.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918100120Z" level=info msg="loading plugin" id=io.containerd.streaming.v1.manager type=io.containerd.streaming.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918112120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.streaming type=io.containerd.grpc.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918124120Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.tasks type=io.containerd.grpc.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918135200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.transfer type=io.containerd.grpc.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918146720Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.version type=io.containerd.grpc.v1 Jan 16 
17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918158400Z" level=info msg="loading plugin" id=io.containerd.monitor.container.v1.restart type=io.containerd.monitor.container.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918191280Z" level=info msg="loading plugin" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918206160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.tracing.processor.v1.otlp type=io.containerd.tracing.processor.v1 Jan 16 17:56:58.919052 containerd[1547]: time="2026-01-16T17:56:58.918215040Z" level=info msg="loading plugin" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918224160Z" level=info msg="skip loading plugin" error="skip plugin: tracing endpoint not configured" id=io.containerd.internal.v1.tracing type=io.containerd.internal.v1 Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918231720Z" level=info msg="loading plugin" id=io.containerd.ttrpc.v1.otelttrpc type=io.containerd.ttrpc.v1 Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918242200Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.healthcheck type=io.containerd.grpc.v1 Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918252360Z" level=info msg="loading plugin" id=io.containerd.nri.v1.nri type=io.containerd.nri.v1 Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918369080Z" level=info msg="runtime interface created" Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918375080Z" level=info msg="created NRI interface" Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918388160Z" level=info msg="loading plugin" id=io.containerd.grpc.v1.cri type=io.containerd.grpc.v1 Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918402520Z" level=info msg="Connect containerd service" Jan 16 17:56:58.919261 containerd[1547]: time="2026-01-16T17:56:58.918430200Z" level=info msg="using experimental NRI integration - disable nri plugin to prevent this" Jan 16 17:56:58.925562 containerd[1547]: time="2026-01-16T17:56:58.923256600Z" level=error msg="failed to load cni during init, please check CRI plugin status before setting up network for pods" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 16 17:56:58.943376 systemd-logind[1532]: Watching system buttons on /dev/input/event2 (QEMU QEMU USB Keyboard) Jan 16 17:56:58.954195 update-ssh-keys[1639]: Updated "/home/core/.ssh/authorized_keys" Jan 16 17:56:58.955598 systemd[1]: Finished coreos-metadata-sshkeys@core.service - Flatcar Metadata Agent (SSH Keys). Jan 16 17:56:58.959633 systemd[1]: Finished sshkeys.service. Jan 16 17:56:58.974214 systemd[1]: systemd-vconsole-setup.service: Deactivated successfully. Jan 16 17:56:58.976662 systemd[1]: Stopped systemd-vconsole-setup.service - Virtual Console Setup. Jan 16 17:56:58.990638 systemd[1]: Starting systemd-vconsole-setup.service - Virtual Console Setup... Jan 16 17:56:59.004445 locksmithd[1598]: locksmithd starting currentOperation="UPDATE_STATUS_IDLE" strategy="reboot" Jan 16 17:56:59.088834 systemd[1]: Finished systemd-vconsole-setup.service - Virtual Console Setup. 
Jan 16 17:56:59.145796 containerd[1547]: time="2026-01-16T17:56:59.145747600Z" level=info msg="Start subscribing containerd event" Jan 16 17:56:59.145942 containerd[1547]: time="2026-01-16T17:56:59.145927840Z" level=info msg="Start recovering state" Jan 16 17:56:59.146109 containerd[1547]: time="2026-01-16T17:56:59.146079440Z" level=info msg=serving... address=/run/containerd/containerd.sock.ttrpc Jan 16 17:56:59.146151 containerd[1547]: time="2026-01-16T17:56:59.146134840Z" level=info msg=serving... address=/run/containerd/containerd.sock Jan 16 17:56:59.146173 containerd[1547]: time="2026-01-16T17:56:59.146080080Z" level=info msg="Start event monitor" Jan 16 17:56:59.146173 containerd[1547]: time="2026-01-16T17:56:59.146165600Z" level=info msg="Start cni network conf syncer for default" Jan 16 17:56:59.146225 containerd[1547]: time="2026-01-16T17:56:59.146174120Z" level=info msg="Start streaming server" Jan 16 17:56:59.146225 containerd[1547]: time="2026-01-16T17:56:59.146182560Z" level=info msg="Registered namespace \"k8s.io\" with NRI" Jan 16 17:56:59.146225 containerd[1547]: time="2026-01-16T17:56:59.146189080Z" level=info msg="runtime interface starting up..." Jan 16 17:56:59.146225 containerd[1547]: time="2026-01-16T17:56:59.146194920Z" level=info msg="starting plugins..." Jan 16 17:56:59.146225 containerd[1547]: time="2026-01-16T17:56:59.146210720Z" level=info msg="Synchronizing NRI (plugin) with current runtime state" Jan 16 17:56:59.146461 systemd[1]: Started containerd.service - containerd container runtime. Jan 16 17:56:59.149910 containerd[1547]: time="2026-01-16T17:56:59.148725640Z" level=info msg="containerd successfully booted in 0.280466s" Jan 16 17:56:59.334839 tar[1536]: linux-arm64/README.md Jan 16 17:56:59.354757 systemd-networkd[1470]: eth1: Gained IPv6LL Jan 16 17:56:59.357717 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:56:59.358342 systemd[1]: Finished prepare-helm.service - Unpack helm to /opt/bin. Jan 16 17:56:59.361722 systemd[1]: Finished systemd-networkd-wait-online.service - Wait for Network to be Configured. Jan 16 17:56:59.363132 systemd[1]: Reached target network-online.target - Network is Online. Jan 16 17:56:59.367753 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:56:59.372920 systemd[1]: Starting nvidia.service - NVIDIA Configure Service... Jan 16 17:56:59.416918 systemd[1]: Finished nvidia.service - NVIDIA Configure Service. Jan 16 17:56:59.559121 sshd_keygen[1569]: ssh-keygen: generating new host keys: RSA ECDSA ED25519 Jan 16 17:56:59.581936 systemd[1]: Finished sshd-keygen.service - Generate sshd host keys. Jan 16 17:56:59.589705 systemd[1]: Starting issuegen.service - Generate /run/issue... Jan 16 17:56:59.609256 systemd[1]: issuegen.service: Deactivated successfully. Jan 16 17:56:59.611659 systemd[1]: Finished issuegen.service - Generate /run/issue. Jan 16 17:56:59.616851 systemd[1]: Starting systemd-user-sessions.service - Permit User Sessions... Jan 16 17:56:59.635687 systemd[1]: Finished systemd-user-sessions.service - Permit User Sessions. Jan 16 17:56:59.639041 systemd[1]: Started getty@tty1.service - Getty on tty1. Jan 16 17:56:59.641484 systemd[1]: Started serial-getty@ttyAMA0.service - Serial Getty on ttyAMA0. Jan 16 17:56:59.644897 systemd[1]: Reached target getty.target - Login Prompts. 
Jan 16 17:56:59.674692 systemd-networkd[1470]: eth0: Gained IPv6LL Jan 16 17:56:59.675256 systemd-timesyncd[1440]: Network configuration changed, trying to establish connection. Jan 16 17:57:00.115880 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:57:00.118626 systemd[1]: Reached target multi-user.target - Multi-User System. Jan 16 17:57:00.119644 systemd[1]: Startup finished in 1.806s (kernel) + 6.703s (initrd) + 4.317s (userspace) = 12.828s. Jan 16 17:57:00.124158 (kubelet)[1701]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:57:00.572967 kubelet[1701]: E0116 17:57:00.572910 1701 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:57:00.575624 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:57:00.575786 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:57:00.576347 systemd[1]: kubelet.service: Consumed 804ms CPU time, 248M memory peak. Jan 16 17:57:10.689351 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 1. Jan 16 17:57:10.692521 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:57:10.862421 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:57:10.874035 (kubelet)[1720]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:57:10.929209 kubelet[1720]: E0116 17:57:10.929152 1720 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:57:10.932335 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:57:10.932620 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:57:10.933309 systemd[1]: kubelet.service: Consumed 177ms CPU time, 107.8M memory peak. Jan 16 17:57:20.939509 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 2. Jan 16 17:57:20.944891 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:57:21.103246 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:57:21.113179 (kubelet)[1735]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:57:21.156555 kubelet[1735]: E0116 17:57:21.156497 1735 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:57:21.159111 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:57:21.159319 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
Jan 16 17:57:21.161666 systemd[1]: kubelet.service: Consumed 162ms CPU time, 106.9M memory peak. Jan 16 17:57:29.900436 systemd-timesyncd[1440]: Contacted time server 188.68.34.173:123 (2.flatcar.pool.ntp.org). Jan 16 17:57:29.900540 systemd-timesyncd[1440]: Initial clock synchronization to Fri 2026-01-16 17:57:30.083800 UTC. Jan 16 17:57:31.189328 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 3. Jan 16 17:57:31.191118 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:57:31.370852 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:57:31.387153 (kubelet)[1750]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:57:31.438311 kubelet[1750]: E0116 17:57:31.438255 1750 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:57:31.441288 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:57:31.441453 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:57:31.442284 systemd[1]: kubelet.service: Consumed 182ms CPU time, 106M memory peak. Jan 16 17:57:33.167436 systemd[1]: Created slice system-sshd.slice - Slice /system/sshd. Jan 16 17:57:33.169424 systemd[1]: Started sshd@0-49.12.189.56:22-64.62.156.211:11497.service - OpenSSH per-connection server daemon (64.62.156.211:11497). Jan 16 17:57:33.819879 sshd[1757]: Invalid user from 64.62.156.211 port 11497 Jan 16 17:57:36.222831 systemd[1]: Started sshd@1-49.12.189.56:22-68.220.241.50:38998.service - OpenSSH per-connection server daemon (68.220.241.50:38998). Jan 16 17:57:36.786606 sshd[1761]: Accepted publickey for core from 68.220.241.50 port 38998 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:36.790060 sshd-session[1761]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:36.805690 systemd[1]: Created slice user-500.slice - User Slice of UID 500. Jan 16 17:57:36.807780 systemd[1]: Starting user-runtime-dir@500.service - User Runtime Directory /run/user/500... Jan 16 17:57:36.812026 systemd-logind[1532]: New session 1 of user core. Jan 16 17:57:36.836592 systemd[1]: Finished user-runtime-dir@500.service - User Runtime Directory /run/user/500. Jan 16 17:57:36.839425 systemd[1]: Starting user@500.service - User Manager for UID 500... Jan 16 17:57:36.859852 (systemd)[1767]: pam_unix(systemd-user:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:36.863234 systemd-logind[1532]: New session 2 of user core. Jan 16 17:57:37.006630 systemd[1767]: Queued start job for default target default.target. Jan 16 17:57:37.020268 systemd[1767]: Created slice app.slice - User Application Slice. Jan 16 17:57:37.020325 systemd[1767]: Started systemd-tmpfiles-clean.timer - Daily Cleanup of User's Temporary Directories. Jan 16 17:57:37.020346 systemd[1767]: Reached target paths.target - Paths. Jan 16 17:57:37.020419 systemd[1767]: Reached target timers.target - Timers. Jan 16 17:57:37.022181 systemd[1767]: Starting dbus.socket - D-Bus User Message Bus Socket... 
Jan 16 17:57:37.025804 systemd[1767]: Starting systemd-tmpfiles-setup.service - Create User Files and Directories... Jan 16 17:57:37.042873 systemd[1767]: Listening on dbus.socket - D-Bus User Message Bus Socket. Jan 16 17:57:37.042965 systemd[1767]: Reached target sockets.target - Sockets. Jan 16 17:57:37.045409 systemd[1767]: Finished systemd-tmpfiles-setup.service - Create User Files and Directories. Jan 16 17:57:37.045483 systemd[1767]: Reached target basic.target - Basic System. Jan 16 17:57:37.045533 systemd[1767]: Reached target default.target - Main User Target. Jan 16 17:57:37.045579 systemd[1767]: Startup finished in 175ms. Jan 16 17:57:37.046064 systemd[1]: Started user@500.service - User Manager for UID 500. Jan 16 17:57:37.049779 systemd[1]: Started session-1.scope - Session 1 of User core. Jan 16 17:57:37.252039 sshd[1757]: Connection closed by invalid user 64.62.156.211 port 11497 [preauth] Jan 16 17:57:37.256320 systemd[1]: sshd@0-49.12.189.56:22-64.62.156.211:11497.service: Deactivated successfully. Jan 16 17:57:37.380691 systemd[1]: Started sshd@2-49.12.189.56:22-68.220.241.50:39000.service - OpenSSH per-connection server daemon (68.220.241.50:39000). Jan 16 17:57:37.973315 sshd[1783]: Accepted publickey for core from 68.220.241.50 port 39000 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:37.974617 sshd-session[1783]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:37.980395 systemd-logind[1532]: New session 3 of user core. Jan 16 17:57:37.989302 systemd[1]: Started session-3.scope - Session 3 of User core. Jan 16 17:57:38.294322 sshd[1787]: Connection closed by 68.220.241.50 port 39000 Jan 16 17:57:38.293978 sshd-session[1783]: pam_unix(sshd:session): session closed for user core Jan 16 17:57:38.299988 systemd[1]: sshd@2-49.12.189.56:22-68.220.241.50:39000.service: Deactivated successfully. Jan 16 17:57:38.303331 systemd[1]: session-3.scope: Deactivated successfully. Jan 16 17:57:38.305112 systemd-logind[1532]: Session 3 logged out. Waiting for processes to exit. Jan 16 17:57:38.306916 systemd-logind[1532]: Removed session 3. Jan 16 17:57:38.396898 systemd[1]: Started sshd@3-49.12.189.56:22-68.220.241.50:39014.service - OpenSSH per-connection server daemon (68.220.241.50:39014). Jan 16 17:57:38.941455 sshd[1793]: Accepted publickey for core from 68.220.241.50 port 39014 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:38.942764 sshd-session[1793]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:38.949856 systemd-logind[1532]: New session 4 of user core. Jan 16 17:57:38.955968 systemd[1]: Started session-4.scope - Session 4 of User core. Jan 16 17:57:39.232477 sshd[1797]: Connection closed by 68.220.241.50 port 39014 Jan 16 17:57:39.231379 sshd-session[1793]: pam_unix(sshd:session): session closed for user core Jan 16 17:57:39.239135 systemd[1]: sshd@3-49.12.189.56:22-68.220.241.50:39014.service: Deactivated successfully. Jan 16 17:57:39.241895 systemd[1]: session-4.scope: Deactivated successfully. Jan 16 17:57:39.244377 systemd-logind[1532]: Session 4 logged out. Waiting for processes to exit. Jan 16 17:57:39.246168 systemd-logind[1532]: Removed session 4. Jan 16 17:57:39.359168 systemd[1]: Started sshd@4-49.12.189.56:22-68.220.241.50:39018.service - OpenSSH per-connection server daemon (68.220.241.50:39018). 
Jan 16 17:57:39.928602 sshd[1803]: Accepted publickey for core from 68.220.241.50 port 39018 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:39.930073 sshd-session[1803]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:39.935941 systemd-logind[1532]: New session 5 of user core. Jan 16 17:57:39.948994 systemd[1]: Started session-5.scope - Session 5 of User core. Jan 16 17:57:40.233366 sshd[1807]: Connection closed by 68.220.241.50 port 39018 Jan 16 17:57:40.233244 sshd-session[1803]: pam_unix(sshd:session): session closed for user core Jan 16 17:57:40.239226 systemd-logind[1532]: Session 5 logged out. Waiting for processes to exit. Jan 16 17:57:40.239605 systemd[1]: sshd@4-49.12.189.56:22-68.220.241.50:39018.service: Deactivated successfully. Jan 16 17:57:40.243126 systemd[1]: session-5.scope: Deactivated successfully. Jan 16 17:57:40.246542 systemd-logind[1532]: Removed session 5. Jan 16 17:57:40.354995 systemd[1]: Started sshd@5-49.12.189.56:22-68.220.241.50:39032.service - OpenSSH per-connection server daemon (68.220.241.50:39032). Jan 16 17:57:40.919445 sshd[1813]: Accepted publickey for core from 68.220.241.50 port 39032 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:40.921168 sshd-session[1813]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:40.928574 systemd-logind[1532]: New session 6 of user core. Jan 16 17:57:40.937857 systemd[1]: Started session-6.scope - Session 6 of User core. Jan 16 17:57:41.138803 sudo[1818]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/setenforce 1 Jan 16 17:57:41.139083 sudo[1818]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:57:41.151130 sudo[1818]: pam_unix(sudo:session): session closed for user root Jan 16 17:57:41.252370 sshd[1817]: Connection closed by 68.220.241.50 port 39032 Jan 16 17:57:41.253675 sshd-session[1813]: pam_unix(sshd:session): session closed for user core Jan 16 17:57:41.260102 systemd-logind[1532]: Session 6 logged out. Waiting for processes to exit. Jan 16 17:57:41.260986 systemd[1]: sshd@5-49.12.189.56:22-68.220.241.50:39032.service: Deactivated successfully. Jan 16 17:57:41.264297 systemd[1]: session-6.scope: Deactivated successfully. Jan 16 17:57:41.267621 systemd-logind[1532]: Removed session 6. Jan 16 17:57:41.371934 systemd[1]: Started sshd@6-49.12.189.56:22-68.220.241.50:39040.service - OpenSSH per-connection server daemon (68.220.241.50:39040). Jan 16 17:57:41.689311 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 4. Jan 16 17:57:41.692796 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:57:41.874880 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. 
Jan 16 17:57:41.887440 (kubelet)[1836]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:57:41.935040 kubelet[1836]: E0116 17:57:41.934997 1836 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:57:41.938020 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:57:41.938162 systemd[1]: kubelet.service: Failed with result 'exit-code'. Jan 16 17:57:41.938825 systemd[1]: kubelet.service: Consumed 165ms CPU time, 104.7M memory peak. Jan 16 17:57:41.955293 sshd[1825]: Accepted publickey for core from 68.220.241.50 port 39040 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:41.957053 sshd-session[1825]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:41.963129 systemd-logind[1532]: New session 7 of user core. Jan 16 17:57:41.970949 systemd[1]: Started session-7.scope - Session 7 of User core. Jan 16 17:57:42.170312 sudo[1846]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/rm -rf /etc/audit/rules.d/80-selinux.rules /etc/audit/rules.d/99-default.rules Jan 16 17:57:42.170660 sudo[1846]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:57:42.175147 sudo[1846]: pam_unix(sudo:session): session closed for user root Jan 16 17:57:42.184987 sudo[1845]: core : PWD=/home/core ; USER=root ; COMMAND=/usr/sbin/systemctl restart audit-rules Jan 16 17:57:42.185264 sudo[1845]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:57:42.194731 systemd[1]: Starting audit-rules.service - Load Audit Rules... Jan 16 17:57:42.258586 kernel: kauditd_printk_skb: 178 callbacks suppressed Jan 16 17:57:42.258686 kernel: audit: type=1305 audit(1768586262.257:223): auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 16 17:57:42.257000 audit: CONFIG_CHANGE auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 op=remove_rule key=(null) list=5 res=1 Jan 16 17:57:42.258769 augenrules[1870]: No rules Jan 16 17:57:42.262823 kernel: audit: type=1300 audit(1768586262.257:223): arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe0f9f0e0 a2=420 a3=0 items=0 ppid=1851 pid=1870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:42.257000 audit[1870]: SYSCALL arch=c00000b7 syscall=206 success=yes exit=1056 a0=3 a1=ffffe0f9f0e0 a2=420 a3=0 items=0 ppid=1851 pid=1870 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="auditctl" exe="/usr/bin/auditctl" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:42.260685 systemd[1]: audit-rules.service: Deactivated successfully. Jan 16 17:57:42.262604 systemd[1]: Finished audit-rules.service - Load Audit Rules. 
Jan 16 17:57:42.263769 sudo[1845]: pam_unix(sudo:session): session closed for user root Jan 16 17:57:42.266374 kernel: audit: type=1327 audit(1768586262.257:223): proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 17:57:42.257000 audit: PROCTITLE proctitle=2F7362696E2F617564697463746C002D52002F6574632F61756469742F61756469742E72756C6573 Jan 16 17:57:42.262000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.268559 kernel: audit: type=1130 audit(1768586262.262:224): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.263000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.263000 audit[1845]: USER_END pid=1845 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.271964 kernel: audit: type=1131 audit(1768586262.263:225): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=audit-rules comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.272019 kernel: audit: type=1106 audit(1768586262.263:226): pid=1845 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.272053 kernel: audit: type=1104 audit(1768586262.263:227): pid=1845 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.263000 audit[1845]: CRED_DISP pid=1845 uid=500 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.368586 sshd[1844]: Connection closed by 68.220.241.50 port 39040 Jan 16 17:57:42.367738 sshd-session[1825]: pam_unix(sshd:session): session closed for user core Jan 16 17:57:42.368000 audit[1825]: USER_END pid=1825 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:42.377765 systemd[1]: sshd@6-49.12.189.56:22-68.220.241.50:39040.service: Deactivated successfully. 
Jan 16 17:57:42.381607 kernel: audit: type=1106 audit(1768586262.368:228): pid=1825 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:42.381685 kernel: audit: type=1104 audit(1768586262.369:229): pid=1825 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:42.381703 kernel: audit: type=1131 audit(1768586262.376:230): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-49.12.189.56:22-68.220.241.50:39040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.369000 audit[1825]: CRED_DISP pid=1825 uid=0 auid=500 ses=7 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:42.376000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@6-49.12.189.56:22-68.220.241.50:39040 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:42.380199 systemd[1]: session-7.scope: Deactivated successfully. Jan 16 17:57:42.383418 systemd-logind[1532]: Session 7 logged out. Waiting for processes to exit. Jan 16 17:57:42.384649 systemd-logind[1532]: Removed session 7. Jan 16 17:57:42.476164 systemd[1]: Started sshd@7-49.12.189.56:22-68.220.241.50:39474.service - OpenSSH per-connection server daemon (68.220.241.50:39474). Jan 16 17:57:42.475000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-49.12.189.56:22-68.220.241.50:39474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:57:43.037000 audit[1879]: USER_ACCT pid=1879 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:43.039721 sshd[1879]: Accepted publickey for core from 68.220.241.50 port 39474 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 17:57:43.040000 audit[1879]: CRED_ACQ pid=1879 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:43.040000 audit[1879]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffcc4f3ea0 a2=3 a3=0 items=0 ppid=1 pid=1879 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=8 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:43.040000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 17:57:43.043087 sshd-session[1879]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 17:57:43.048057 systemd-logind[1532]: New session 8 of user core. Jan 16 17:57:43.066956 systemd[1]: Started session-8.scope - Session 8 of User core. Jan 16 17:57:43.072000 audit[1879]: USER_START pid=1879 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:43.074000 audit[1883]: CRED_ACQ pid=1883 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:57:43.247000 audit[1884]: USER_ACCT pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_unix,pam_faillock acct="core" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:43.249102 sudo[1884]: core : PWD=/home/core ; USER=root ; COMMAND=/home/core/install.sh Jan 16 17:57:43.248000 audit[1884]: CRED_REFR pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:43.249880 sudo[1884]: pam_unix(sudo:session): session opened for user root(uid=0) by core(uid=500) Jan 16 17:57:43.248000 audit[1884]: USER_START pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:57:43.292783 update_engine[1533]: I20260116 17:57:43.291853 1533 update_attempter.cc:509] Updating boot flags... Jan 16 17:57:43.607089 systemd[1]: Starting docker.service - Docker Application Container Engine... 
Jan 16 17:57:43.623118 (dockerd)[1918]: docker.service: Referenced but unset environment variable evaluates to an empty string: DOCKER_CGROUPS, DOCKER_OPTS, DOCKER_OPT_BIP, DOCKER_OPT_IPMASQ, DOCKER_OPT_MTU Jan 16 17:57:43.880797 dockerd[1918]: time="2026-01-16T17:57:43.879952172Z" level=info msg="Starting up" Jan 16 17:57:43.884964 dockerd[1918]: time="2026-01-16T17:57:43.884924238Z" level=info msg="OTEL tracing is not configured, using no-op tracer provider" Jan 16 17:57:43.898023 dockerd[1918]: time="2026-01-16T17:57:43.897918456Z" level=info msg="Creating a containerd client" address=/var/run/docker/libcontainerd/docker-containerd.sock timeout=1m0s Jan 16 17:57:43.924174 systemd[1]: var-lib-docker-metacopy\x2dcheck3012625423-merged.mount: Deactivated successfully. Jan 16 17:57:43.934941 dockerd[1918]: time="2026-01-16T17:57:43.934865230Z" level=info msg="Loading containers: start." Jan 16 17:57:43.946568 kernel: Initializing XFRM netlink socket Jan 16 17:57:44.006000 audit[1967]: NETFILTER_CFG table=nat:2 family=2 entries=2 op=nft_register_chain pid=1967 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.006000 audit[1967]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffffc0d12d0 a2=0 a3=0 items=0 ppid=1918 pid=1967 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.006000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 16 17:57:44.008000 audit[1969]: NETFILTER_CFG table=filter:3 family=2 entries=2 op=nft_register_chain pid=1969 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.008000 audit[1969]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=ffffc9500e20 a2=0 a3=0 items=0 ppid=1918 pid=1969 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.008000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 16 17:57:44.010000 audit[1971]: NETFILTER_CFG table=filter:4 family=2 entries=1 op=nft_register_chain pid=1971 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.010000 audit[1971]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd17ed6e0 a2=0 a3=0 items=0 ppid=1918 pid=1971 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.010000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 16 17:57:44.012000 audit[1973]: NETFILTER_CFG table=filter:5 family=2 entries=1 op=nft_register_chain pid=1973 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.012000 audit[1973]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe2ecda00 a2=0 a3=0 items=0 ppid=1918 pid=1973 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.012000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 16 17:57:44.015000 audit[1975]: NETFILTER_CFG table=filter:6 family=2 entries=1 op=nft_register_chain pid=1975 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.015000 audit[1975]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffcef8e7d0 a2=0 a3=0 items=0 ppid=1918 pid=1975 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.015000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 16 17:57:44.019000 audit[1977]: NETFILTER_CFG table=filter:7 family=2 entries=1 op=nft_register_chain pid=1977 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.019000 audit[1977]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc4c793c0 a2=0 a3=0 items=0 ppid=1918 pid=1977 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.019000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:57:44.021000 audit[1979]: NETFILTER_CFG table=filter:8 family=2 entries=1 op=nft_register_chain pid=1979 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.021000 audit[1979]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=fffffbbad550 a2=0 a3=0 items=0 ppid=1918 pid=1979 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.021000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 17:57:44.023000 audit[1981]: NETFILTER_CFG table=nat:9 family=2 entries=2 op=nft_register_chain pid=1981 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.023000 audit[1981]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffc96034c0 a2=0 a3=0 items=0 ppid=1918 pid=1981 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.023000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 16 17:57:44.061000 audit[1984]: NETFILTER_CFG table=nat:10 family=2 entries=2 op=nft_register_chain pid=1984 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.061000 audit[1984]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=472 a0=3 a1=ffffefc4a160 a2=0 a3=0 items=0 ppid=1918 pid=1984 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.061000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38 Jan 16 17:57:44.063000 audit[1986]: NETFILTER_CFG table=filter:11 family=2 entries=2 op=nft_register_chain pid=1986 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.063000 audit[1986]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffe64c2ae0 a2=0 a3=0 items=0 ppid=1918 pid=1986 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.063000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 16 17:57:44.066000 audit[1988]: NETFILTER_CFG table=filter:12 family=2 entries=1 op=nft_register_rule pid=1988 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.066000 audit[1988]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffd1deff30 a2=0 a3=0 items=0 ppid=1918 pid=1988 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.066000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 16 17:57:44.069000 audit[1990]: NETFILTER_CFG table=filter:13 family=2 entries=1 op=nft_register_rule pid=1990 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.069000 audit[1990]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffcc82fd20 a2=0 a3=0 items=0 ppid=1918 pid=1990 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.069000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:57:44.071000 audit[1992]: NETFILTER_CFG table=filter:14 family=2 entries=1 op=nft_register_rule pid=1992 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.071000 audit[1992]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffef545ca0 a2=0 a3=0 items=0 ppid=1918 pid=1992 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.071000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 16 17:57:44.118000 audit[2022]: NETFILTER_CFG table=nat:15 family=10 entries=2 op=nft_register_chain pid=2022 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.118000 audit[2022]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=116 a0=3 a1=fffffa5c0970 a2=0 a3=0 items=0 ppid=1918 pid=2022 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.118000 audit: PROCTITLE 
proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4E00444F434B4552 Jan 16 17:57:44.121000 audit[2024]: NETFILTER_CFG table=filter:16 family=10 entries=2 op=nft_register_chain pid=2024 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.121000 audit[2024]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=124 a0=3 a1=fffff3aec300 a2=0 a3=0 items=0 ppid=1918 pid=2024 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.121000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B4552 Jan 16 17:57:44.124000 audit[2026]: NETFILTER_CFG table=filter:17 family=10 entries=1 op=nft_register_chain pid=2026 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.124000 audit[2026]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffffaa6a50 a2=0 a3=0 items=0 ppid=1918 pid=2026 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.124000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D464F5257415244 Jan 16 17:57:44.126000 audit[2028]: NETFILTER_CFG table=filter:18 family=10 entries=1 op=nft_register_chain pid=2028 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.126000 audit[2028]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc358cff0 a2=0 a3=0 items=0 ppid=1918 pid=2028 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.126000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D425249444745 Jan 16 17:57:44.128000 audit[2030]: NETFILTER_CFG table=filter:19 family=10 entries=1 op=nft_register_chain pid=2030 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.128000 audit[2030]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=fffff3959000 a2=0 a3=0 items=0 ppid=1918 pid=2030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.128000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D4354 Jan 16 17:57:44.130000 audit[2032]: NETFILTER_CFG table=filter:20 family=10 entries=1 op=nft_register_chain pid=2032 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.130000 audit[2032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffdfc195a0 a2=0 a3=0 items=0 ppid=1918 pid=2032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.130000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:57:44.134000 audit[2034]: NETFILTER_CFG table=filter:21 family=10 entries=1 op=nft_register_chain pid=2034 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.134000 audit[2034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=112 a0=3 a1=ffffc0299390 a2=0 a3=0 items=0 ppid=1918 pid=2034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.134000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 17:57:44.137000 audit[2036]: NETFILTER_CFG table=nat:22 family=10 entries=2 op=nft_register_chain pid=2036 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.137000 audit[2036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=384 a0=3 a1=ffffe853b140 a2=0 a3=0 items=0 ppid=1918 pid=2036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.137000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D4100505245524F5554494E47002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B4552 Jan 16 17:57:44.141000 audit[2038]: NETFILTER_CFG table=nat:23 family=10 entries=2 op=nft_register_chain pid=2038 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.141000 audit[2038]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=484 a0=3 a1=ffffe2f93ad0 a2=0 a3=0 items=0 ppid=1918 pid=2038 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.141000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003A3A312F313238 Jan 16 17:57:44.143000 audit[2040]: NETFILTER_CFG table=filter:24 family=10 entries=2 op=nft_register_chain pid=2040 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.143000 audit[2040]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffea92f340 a2=0 a3=0 items=0 ppid=1918 pid=2040 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.143000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D464F5257415244 Jan 16 17:57:44.146000 audit[2042]: NETFILTER_CFG table=filter:25 family=10 entries=1 op=nft_register_rule pid=2042 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.146000 audit[2042]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=236 a0=3 a1=ffffe304e2c0 a2=0 a3=0 items=0 ppid=1918 pid=2042 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.146000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D425249444745 Jan 16 17:57:44.148000 audit[2044]: NETFILTER_CFG table=filter:26 family=10 entries=1 op=nft_register_rule pid=2044 
subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.148000 audit[2044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=248 a0=3 a1=ffffcad18c20 a2=0 a3=0 items=0 ppid=1918 pid=2044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.148000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D31 Jan 16 17:57:44.150000 audit[2046]: NETFILTER_CFG table=filter:27 family=10 entries=1 op=nft_register_rule pid=2046 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.150000 audit[2046]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=232 a0=3 a1=ffffdbef2ed0 a2=0 a3=0 items=0 ppid=1918 pid=2046 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.150000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900444F434B45522D464F5257415244002D6A00444F434B45522D4354 Jan 16 17:57:44.156000 audit[2051]: NETFILTER_CFG table=filter:28 family=2 entries=1 op=nft_register_chain pid=2051 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.156000 audit[2051]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffee452290 a2=0 a3=0 items=0 ppid=1918 pid=2051 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.156000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 16 17:57:44.158000 audit[2053]: NETFILTER_CFG table=filter:29 family=2 entries=1 op=nft_register_rule pid=2053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.158000 audit[2053]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffd4954130 a2=0 a3=0 items=0 ppid=1918 pid=2053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.158000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 16 17:57:44.161000 audit[2055]: NETFILTER_CFG table=filter:30 family=2 entries=1 op=nft_register_rule pid=2055 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.161000 audit[2055]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=ffffed19dba0 a2=0 a3=0 items=0 ppid=1918 pid=2055 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.161000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 16 17:57:44.163000 audit[2057]: NETFILTER_CFG table=filter:31 family=10 entries=1 op=nft_register_chain pid=2057 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.163000 audit[2057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffe90d9720 a2=0 a3=0 items=0 ppid=1918 pid=2057 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.163000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D740066696C746572002D4E00444F434B45522D55534552 Jan 16 17:57:44.169000 audit[2059]: NETFILTER_CFG table=filter:32 family=10 entries=1 op=nft_register_rule pid=2059 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.169000 audit[2059]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=212 a0=3 a1=ffffcaace490 a2=0 a3=0 items=0 ppid=1918 pid=2059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.169000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4100444F434B45522D55534552002D6A0052455455524E Jan 16 17:57:44.172000 audit[2061]: NETFILTER_CFG table=filter:33 family=10 entries=1 op=nft_register_rule pid=2061 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:57:44.172000 audit[2061]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=224 a0=3 a1=fffff371e9f0 a2=0 a3=0 items=0 ppid=1918 pid=2061 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.172000 audit: PROCTITLE proctitle=2F7573722F62696E2F6970367461626C6573002D2D77616974002D4900464F5257415244002D6A00444F434B45522D55534552 Jan 16 17:57:44.194000 audit[2065]: NETFILTER_CFG table=nat:34 family=2 entries=2 op=nft_register_chain pid=2065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.194000 audit[2065]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=520 a0=3 a1=ffffe8129280 a2=0 a3=0 items=0 ppid=1918 pid=2065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.194000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900504F5354524F5554494E47002D73003137322E31372E302E302F31360000002D6F00646F636B657230002D6A004D415351554552414445 Jan 16 17:57:44.196000 audit[2067]: NETFILTER_CFG table=nat:35 family=2 entries=1 op=nft_register_rule pid=2067 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.196000 audit[2067]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=288 a0=3 a1=ffffc4352a10 a2=0 a3=0 items=0 ppid=1918 pid=2067 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.196000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D4900444F434B4552002D6900646F636B657230002D6A0052455455524E Jan 16 17:57:44.210000 audit[2075]: NETFILTER_CFG table=filter:36 family=2 entries=1 op=nft_register_rule pid=2075 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.210000 audit[2075]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=300 a0=3 a1=fffff583b090 a2=0 a3=0 items=0 ppid=1918 pid=2075 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.210000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D464F5257415244002D6900646F636B657230002D6A00414343455054 Jan 16 17:57:44.224000 audit[2081]: NETFILTER_CFG table=filter:37 family=2 entries=1 op=nft_register_rule pid=2081 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.224000 audit[2081]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=376 a0=3 a1=ffffeff29a60 a2=0 a3=0 items=0 ppid=1918 pid=2081 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.224000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45520000002D6900646F636B657230002D6F00646F636B657230002D6A0044524F50 Jan 16 17:57:44.227000 audit[2083]: NETFILTER_CFG table=filter:38 family=2 entries=1 op=nft_register_rule pid=2083 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.227000 audit[2083]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=512 a0=3 a1=fffffbf83760 a2=0 a3=0 items=0 ppid=1918 pid=2083 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.227000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D4354002D6F00646F636B657230002D6D00636F6E6E747261636B002D2D637473746174650052454C415445442C45535441424C4953484544002D6A00414343455054 Jan 16 17:57:44.229000 audit[2085]: NETFILTER_CFG table=filter:39 family=2 entries=1 op=nft_register_rule pid=2085 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.229000 audit[2085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffebc12730 a2=0 a3=0 items=0 ppid=1918 pid=2085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.229000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D425249444745002D6F00646F636B657230002D6A00444F434B4552 Jan 16 17:57:44.231000 audit[2087]: NETFILTER_CFG table=filter:40 family=2 entries=1 op=nft_register_rule pid=2087 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.231000 audit[2087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=428 a0=3 a1=fffff56f6bc0 a2=0 a3=0 items=0 ppid=1918 pid=2087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.231000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4100444F434B45522D49534F4C4154494F4E2D53544147452D31002D6900646F636B6572300000002D6F00646F636B657230002D6A00444F434B45522D49534F4C4154494F4E2D53544147452D32 Jan 16 17:57:44.233000 audit[2089]: NETFILTER_CFG table=filter:41 family=2 entries=1 op=nft_register_rule pid=2089 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:57:44.233000 audit[2089]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=312 a0=3 a1=ffffde089320 a2=0 a3=0 items=0 ppid=1918 pid=2089 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:57:44.233000 audit: PROCTITLE proctitle=2F7573722F62696E2F69707461626C6573002D2D77616974002D740066696C746572002D4900444F434B45522D49534F4C4154494F4E2D53544147452D32002D6F00646F636B657230002D6A0044524F50 Jan 16 17:57:44.235422 systemd-networkd[1470]: docker0: Link UP Jan 16 17:57:44.243538 dockerd[1918]: time="2026-01-16T17:57:44.243468809Z" level=info msg="Loading containers: done." Jan 16 17:57:44.261183 systemd[1]: var-lib-docker-overlay2-opaque\x2dbug\x2dcheck3863417913-merged.mount: Deactivated successfully. Jan 16 17:57:44.273921 dockerd[1918]: time="2026-01-16T17:57:44.273854150Z" level=warning msg="Not using native diff for overlay2, this may cause degraded performance for building images: kernel has CONFIG_OVERLAY_FS_REDIRECT_DIR enabled" storage-driver=overlay2 Jan 16 17:57:44.274691 dockerd[1918]: time="2026-01-16T17:57:44.274080147Z" level=info msg="Docker daemon" commit=6430e49a55babd9b8f4d08e70ecb2b68900770fe containerd-snapshotter=false storage-driver=overlay2 version=28.0.4 Jan 16 17:57:44.274691 dockerd[1918]: time="2026-01-16T17:57:44.274293820Z" level=info msg="Initializing buildkit" Jan 16 17:57:44.299243 dockerd[1918]: time="2026-01-16T17:57:44.299205220Z" level=info msg="Completed buildkit initialization" Jan 16 17:57:44.308428 dockerd[1918]: time="2026-01-16T17:57:44.308383785Z" level=info msg="Daemon has completed initialization" Jan 16 17:57:44.309047 dockerd[1918]: time="2026-01-16T17:57:44.308894786Z" level=info msg="API listen on /run/docker.sock" Jan 16 17:57:44.308730 systemd[1]: Started docker.service - Docker Application Container Engine. Jan 16 17:57:44.307000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=docker comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:45.397897 containerd[1547]: time="2026-01-16T17:57:45.397816213Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\"" Jan 16 17:57:46.049851 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount711571936.mount: Deactivated successfully. 
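The PROCTITLE values in the audit records above are the iptables/ip6tables command lines that dockerd is running, hex-encoded because the audit subsystem hex-encodes any field containing non-printable bytes (here, the NUL separators between argv entries). A minimal decoding sketch in Python (the helper name is illustrative; the sample is the first proctitle value in this stretch, with empty argv entries dropped for readability):

def decode_proctitle(hex_value: str) -> str:
    # PROCTITLE is the audited process's argv, NUL-separated and hex-encoded.
    raw = bytes.fromhex(hex_value)
    return " ".join(p.decode("utf-8", "replace") for p in raw.split(b"\x00") if p)

sample = "2F7573722F62696E2F69707461626C6573002D2D77616974002D74006E6174002D41004F5554505554002D6D006164647274797065002D2D6473742D74797065004C4F43414C002D6A00444F434B45520000002D2D647374003132372E302E302E302F38"
print(decode_proctitle(sample))
# -> /usr/bin/iptables --wait -t nat -A OUTPUT -m addrtype --dst-type LOCAL -j DOCKER --dst 127.0.0.0/8

Decoded this way, the run of NETFILTER_CFG/PROCTITLE pairs above is Docker's usual chain setup (DOCKER, DOCKER-FORWARD, DOCKER-BRIDGE, DOCKER-CT, DOCKER-ISOLATION-STAGE-1/2, DOCKER-USER) in the filter and nat tables, for both IPv4 and IPv6.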
Jan 16 17:57:46.923095 containerd[1547]: time="2026-01-16T17:57:46.923007172Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:46.925220 containerd[1547]: time="2026-01-16T17:57:46.924819228Z" level=info msg="stop pulling image registry.k8s.io/kube-apiserver:v1.34.3: active requests=0, bytes read=22974850" Jan 16 17:57:46.926781 containerd[1547]: time="2026-01-16T17:57:46.926729509Z" level=info msg="ImageCreate event name:\"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:46.931519 containerd[1547]: time="2026-01-16T17:57:46.931469637Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:46.932665 containerd[1547]: time="2026-01-16T17:57:46.932627525Z" level=info msg="Pulled image \"registry.k8s.io/kube-apiserver:v1.34.3\" with image id \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\", repo tag \"registry.k8s.io/kube-apiserver:v1.34.3\", repo digest \"registry.k8s.io/kube-apiserver@sha256:5af1030676ceca025742ef5e73a504d11b59be0e5551cdb8c9cf0d3c1231b460\", size \"24567639\" in 1.534719553s" Jan 16 17:57:46.932747 containerd[1547]: time="2026-01-16T17:57:46.932669198Z" level=info msg="PullImage \"registry.k8s.io/kube-apiserver:v1.34.3\" returns image reference \"sha256:cf65ae6c8f700cc27f57b7305c6e2b71276a7eed943c559a0091e1e667169896\"" Jan 16 17:57:46.933139 containerd[1547]: time="2026-01-16T17:57:46.933112355Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\"" Jan 16 17:57:48.503903 containerd[1547]: time="2026-01-16T17:57:48.503851563Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:48.505240 containerd[1547]: time="2026-01-16T17:57:48.504907388Z" level=info msg="stop pulling image registry.k8s.io/kube-controller-manager:v1.34.3: active requests=0, bytes read=19127323" Jan 16 17:57:48.506148 containerd[1547]: time="2026-01-16T17:57:48.506115850Z" level=info msg="ImageCreate event name:\"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:48.509399 containerd[1547]: time="2026-01-16T17:57:48.509366699Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:48.510485 containerd[1547]: time="2026-01-16T17:57:48.510456314Z" level=info msg="Pulled image \"registry.k8s.io/kube-controller-manager:v1.34.3\" with image id \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\", repo tag \"registry.k8s.io/kube-controller-manager:v1.34.3\", repo digest \"registry.k8s.io/kube-controller-manager@sha256:716a210d31ee5e27053ea0e1a3a3deb4910791a85ba4b1120410b5a4cbcf1954\", size \"20719958\" in 1.577311838s" Jan 16 17:57:48.510660 containerd[1547]: time="2026-01-16T17:57:48.510640656Z" level=info msg="PullImage \"registry.k8s.io/kube-controller-manager:v1.34.3\" returns image reference \"sha256:7ada8ff13e54bf42ca66f146b54cd7b1757797d93b3b9ba06df034cdddb5ab22\"" Jan 16 17:57:48.511184 
containerd[1547]: time="2026-01-16T17:57:48.511072149Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\"" Jan 16 17:57:49.546392 containerd[1547]: time="2026-01-16T17:57:49.546305463Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:49.548458 containerd[1547]: time="2026-01-16T17:57:49.548397533Z" level=info msg="stop pulling image registry.k8s.io/kube-scheduler:v1.34.3: active requests=0, bytes read=14183580" Jan 16 17:57:49.549446 containerd[1547]: time="2026-01-16T17:57:49.549381957Z" level=info msg="ImageCreate event name:\"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:49.553373 containerd[1547]: time="2026-01-16T17:57:49.553307509Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:49.554665 containerd[1547]: time="2026-01-16T17:57:49.554604178Z" level=info msg="Pulled image \"registry.k8s.io/kube-scheduler:v1.34.3\" with image id \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\", repo tag \"registry.k8s.io/kube-scheduler:v1.34.3\", repo digest \"registry.k8s.io/kube-scheduler@sha256:f9a9bc7948fd804ef02255fe82ac2e85d2a66534bae2fe1348c14849260a1fe2\", size \"15776215\" in 1.04326604s" Jan 16 17:57:49.554665 containerd[1547]: time="2026-01-16T17:57:49.554647977Z" level=info msg="PullImage \"registry.k8s.io/kube-scheduler:v1.34.3\" returns image reference \"sha256:2f2aa21d34d2db37a290752f34faf1d41087c02e18aa9d046a8b4ba1e29421a6\"" Jan 16 17:57:49.555190 containerd[1547]: time="2026-01-16T17:57:49.555137464Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\"" Jan 16 17:57:50.506674 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount958916704.mount: Deactivated successfully. 
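The containerd "Pulled image" entries above report both the image size and the wall-clock pull time, so an approximate pull rate can be read straight off them; for the kube-apiserver pull, 24567639 bytes in ~1.53 s works out to roughly 15 MiB/s. A small parsing sketch (regex and helper name are illustrative; it assumes the exact containerd phrasing above, a seconds-denominated duration, and the \" escaping that the journal shows inside msg="..."):

import re

PULLED = re.compile(r'Pulled image \\"(?P<image>[^"\\]+)\\".*size \\"(?P<size>\d+)\\" in (?P<secs>[0-9.]+)s"')

def pull_rate(line: str):
    # Returns (image, bytes, seconds, MiB/s) for a raw journal line, or None.
    m = PULLED.search(line)
    if not m:
        return None
    size, secs = int(m.group("size")), float(m.group("secs"))
    return m.group("image"), size, secs, size / secs / (1 << 20)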
Jan 16 17:57:50.758905 containerd[1547]: time="2026-01-16T17:57:50.758757245Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy:v1.34.3\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:50.760609 containerd[1547]: time="2026-01-16T17:57:50.760516434Z" level=info msg="stop pulling image registry.k8s.io/kube-proxy:v1.34.3: active requests=0, bytes read=0" Jan 16 17:57:50.761936 containerd[1547]: time="2026-01-16T17:57:50.761873185Z" level=info msg="ImageCreate event name:\"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:50.765841 containerd[1547]: time="2026-01-16T17:57:50.765784907Z" level=info msg="ImageCreate event name:\"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:50.767246 containerd[1547]: time="2026-01-16T17:57:50.766698035Z" level=info msg="Pulled image \"registry.k8s.io/kube-proxy:v1.34.3\" with image id \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\", repo tag \"registry.k8s.io/kube-proxy:v1.34.3\", repo digest \"registry.k8s.io/kube-proxy@sha256:7298ab89a103523d02ff4f49bedf9359710af61df92efdc07bac873064f03ed6\", size \"22804272\" in 1.211535168s" Jan 16 17:57:50.767246 containerd[1547]: time="2026-01-16T17:57:50.766745551Z" level=info msg="PullImage \"registry.k8s.io/kube-proxy:v1.34.3\" returns image reference \"sha256:4461daf6b6af87cf200fc22cecc9a2120959aabaf5712ba54ef5b4a6361d1162\"" Jan 16 17:57:50.767739 containerd[1547]: time="2026-01-16T17:57:50.767710320Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\"" Jan 16 17:57:51.401534 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3721343355.mount: Deactivated successfully. 
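The var-lib-containerd-tmpmounts-containerd\x2dmountNNN.mount units that systemd keeps deactivating around each pull are the automatically tracked mount units for containerd's short-lived mounts under /var/lib/containerd/tmpmounts; the unit names are systemd path-escaped ('/' becomes '-', a literal '-' becomes \x2d). A small reversal sketch (helper name illustrative; it assumes only \xNN byte escapes appear in the name):

import re

def unescape_mount_unit(name: str) -> str:
    # Strip the unit suffix, turn '-' separators back into '/', then undo \xNN escapes.
    path = "/" + name.removesuffix(".mount").replace("-", "/")
    return re.sub(r'\\x([0-9a-fA-F]{2})', lambda m: chr(int(m.group(1), 16)), path)

print(unescape_mount_unit(r"var-lib-containerd-tmpmounts-containerd\x2dmount711571936.mount"))
# -> /var/lib/containerd/tmpmounts/containerd-mount711571936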
Jan 16 17:57:52.082497 containerd[1547]: time="2026-01-16T17:57:52.082429031Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns:v1.12.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:52.085067 containerd[1547]: time="2026-01-16T17:57:52.084973000Z" level=info msg="stop pulling image registry.k8s.io/coredns/coredns:v1.12.1: active requests=0, bytes read=19575910" Jan 16 17:57:52.085835 containerd[1547]: time="2026-01-16T17:57:52.085747701Z" level=info msg="ImageCreate event name:\"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:52.089609 containerd[1547]: time="2026-01-16T17:57:52.089345030Z" level=info msg="ImageCreate event name:\"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:52.092270 containerd[1547]: time="2026-01-16T17:57:52.092055762Z" level=info msg="Pulled image \"registry.k8s.io/coredns/coredns:v1.12.1\" with image id \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\", repo tag \"registry.k8s.io/coredns/coredns:v1.12.1\", repo digest \"registry.k8s.io/coredns/coredns@sha256:e8c262566636e6bc340ece6473b0eed193cad045384401529721ddbe6463d31c\", size \"20392204\" in 1.324303661s" Jan 16 17:57:52.092270 containerd[1547]: time="2026-01-16T17:57:52.092110709Z" level=info msg="PullImage \"registry.k8s.io/coredns/coredns:v1.12.1\" returns image reference \"sha256:138784d87c9c50f8e59412544da4cf4928d61ccbaf93b9f5898a3ba406871bfc\"" Jan 16 17:57:52.092661 containerd[1547]: time="2026-01-16T17:57:52.092626255Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\"" Jan 16 17:57:52.189130 systemd[1]: kubelet.service: Scheduled restart job, restart counter is at 5. Jan 16 17:57:52.191916 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:57:52.360326 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:57:52.362530 kernel: kauditd_printk_skb: 132 callbacks suppressed Jan 16 17:57:52.362642 kernel: audit: type=1130 audit(1768586272.358:281): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:52.358000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:57:52.374102 (kubelet)[2264]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS, KUBELET_KUBEADM_ARGS Jan 16 17:57:52.421842 kubelet[2264]: E0116 17:57:52.421795 2264 run.go:72] "command failed" err="failed to load kubelet config file, path: /var/lib/kubelet/config.yaml, error: failed to load Kubelet config file /var/lib/kubelet/config.yaml, error failed to read kubelet config file \"/var/lib/kubelet/config.yaml\", error: open /var/lib/kubelet/config.yaml: no such file or directory" Jan 16 17:57:52.426005 systemd[1]: kubelet.service: Main process exited, code=exited, status=1/FAILURE Jan 16 17:57:52.426154 systemd[1]: kubelet.service: Failed with result 'exit-code'. 
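The kubelet exit above (restart counter already at 5) is the expected pre-bootstrap state: /var/lib/kubelet/config.yaml does not exist yet, so systemd keeps restarting the unit until provisioning writes that file, which evidently happens before the successful start at 17:58:02 further down. The kubelet's own messages use the klog header format <severity><mmdd> <hh:mm:ss.uuuuuu> <pid> <file:line>] <message>; a minimal parsing sketch (regex name illustrative; the sample line is abridged from the run.go:72 error above):

import re

KLOG = re.compile(
    r'(?P<sev>[IWEF])(?P<mmdd>\d{4}) (?P<time>\d{2}:\d{2}:\d{2}\.\d+)\s+'
    r'(?P<pid>\d+) (?P<src>[\w./-]+:\d+)\] (?P<msg>.*)'
)

line = 'E0116 17:57:52.421795 2264 run.go:72] "command failed" err="failed to load kubelet config file ..."'
m = KLOG.search(line)
print(m.group("sev"), m.group("src"), m.group("msg"))
# -> E run.go:72 "command failed" err="failed to load kubelet config file ..."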
Jan 16 17:57:52.425000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 17:57:52.426854 systemd[1]: kubelet.service: Consumed 165ms CPU time, 106.7M memory peak. Jan 16 17:57:52.430584 kernel: audit: type=1131 audit(1768586272.425:282): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 17:57:52.620395 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount467559490.mount: Deactivated successfully. Jan 16 17:57:52.627583 containerd[1547]: time="2026-01-16T17:57:52.626894273Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10.1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:52.628723 containerd[1547]: time="2026-01-16T17:57:52.628675596Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10.1: active requests=0, bytes read=0" Jan 16 17:57:52.630169 containerd[1547]: time="2026-01-16T17:57:52.630143539Z" level=info msg="ImageCreate event name:\"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:52.632967 containerd[1547]: time="2026-01-16T17:57:52.632918348Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:52.633785 containerd[1547]: time="2026-01-16T17:57:52.633748717Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10.1\" with image id \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\", repo tag \"registry.k8s.io/pause:3.10.1\", repo digest \"registry.k8s.io/pause@sha256:278fb9dbcca9518083ad1e11276933a2e96f23de604a3a08cc3c80002767d24c\", size \"267939\" in 541.083615ms" Jan 16 17:57:52.633785 containerd[1547]: time="2026-01-16T17:57:52.633780395Z" level=info msg="PullImage \"registry.k8s.io/pause:3.10.1\" returns image reference \"sha256:d7b100cd9a77ba782c5e428c8dd5a1df4a1e79d4cb6294acd7d01290ab3babbd\"" Jan 16 17:57:52.634279 containerd[1547]: time="2026-01-16T17:57:52.634234867Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\"" Jan 16 17:57:53.433627 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2866112974.mount: Deactivated successfully. 
Jan 16 17:57:55.886580 containerd[1547]: time="2026-01-16T17:57:55.885509269Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd:3.6.4-0\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:55.887597 containerd[1547]: time="2026-01-16T17:57:55.887521667Z" level=info msg="stop pulling image registry.k8s.io/etcd:3.6.4-0: active requests=0, bytes read=96314798" Jan 16 17:57:55.887913 containerd[1547]: time="2026-01-16T17:57:55.887882361Z" level=info msg="ImageCreate event name:\"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:55.891325 containerd[1547]: time="2026-01-16T17:57:55.891285851Z" level=info msg="ImageCreate event name:\"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:57:55.892641 containerd[1547]: time="2026-01-16T17:57:55.892607006Z" level=info msg="Pulled image \"registry.k8s.io/etcd:3.6.4-0\" with image id \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\", repo tag \"registry.k8s.io/etcd:3.6.4-0\", repo digest \"registry.k8s.io/etcd@sha256:e36c081683425b5b3bc1425bc508b37e7107bb65dfa9367bf5a80125d431fa19\", size \"98207481\" in 3.257592155s" Jan 16 17:57:55.892641 containerd[1547]: time="2026-01-16T17:57:55.892639833Z" level=info msg="PullImage \"registry.k8s.io/etcd:3.6.4-0\" returns image reference \"sha256:a1894772a478e07c67a56e8bf32335fdbe1dd4ec96976a5987083164bd00bc0e\"" Jan 16 17:58:01.458704 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:58:01.459000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:01.462288 systemd[1]: kubelet.service: Consumed 165ms CPU time, 106.7M memory peak. Jan 16 17:58:01.461000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:01.466115 kernel: audit: type=1130 audit(1768586281.459:283): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:01.466239 kernel: audit: type=1131 audit(1768586281.461:284): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:01.467846 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:58:01.511206 systemd[1]: Reload requested from client PID 2356 ('systemctl') (unit session-8.scope)... Jan 16 17:58:01.511225 systemd[1]: Reloading... Jan 16 17:58:01.637578 zram_generator::config[2402]: No configuration found. Jan 16 17:58:01.837865 systemd[1]: Reloading finished in 326 ms. 
Jan 16 17:58:01.865625 kernel: audit: type=1334 audit(1768586281.862:285): prog-id=61 op=LOAD Jan 16 17:58:01.865721 kernel: audit: type=1334 audit(1768586281.862:286): prog-id=47 op=UNLOAD Jan 16 17:58:01.865748 kernel: audit: type=1334 audit(1768586281.863:287): prog-id=62 op=LOAD Jan 16 17:58:01.862000 audit: BPF prog-id=61 op=LOAD Jan 16 17:58:01.862000 audit: BPF prog-id=47 op=UNLOAD Jan 16 17:58:01.863000 audit: BPF prog-id=62 op=LOAD Jan 16 17:58:01.866721 kernel: audit: type=1334 audit(1768586281.863:288): prog-id=63 op=LOAD Jan 16 17:58:01.863000 audit: BPF prog-id=63 op=LOAD Jan 16 17:58:01.867911 kernel: audit: type=1334 audit(1768586281.863:289): prog-id=48 op=UNLOAD Jan 16 17:58:01.867980 kernel: audit: type=1334 audit(1768586281.863:290): prog-id=49 op=UNLOAD Jan 16 17:58:01.863000 audit: BPF prog-id=48 op=UNLOAD Jan 16 17:58:01.863000 audit: BPF prog-id=49 op=UNLOAD Jan 16 17:58:01.865000 audit: BPF prog-id=64 op=LOAD Jan 16 17:58:01.865000 audit: BPF prog-id=65 op=LOAD Jan 16 17:58:01.869620 kernel: audit: type=1334 audit(1768586281.865:291): prog-id=64 op=LOAD Jan 16 17:58:01.869670 kernel: audit: type=1334 audit(1768586281.865:292): prog-id=65 op=LOAD Jan 16 17:58:01.865000 audit: BPF prog-id=54 op=UNLOAD Jan 16 17:58:01.865000 audit: BPF prog-id=55 op=UNLOAD Jan 16 17:58:01.866000 audit: BPF prog-id=66 op=LOAD Jan 16 17:58:01.866000 audit: BPF prog-id=53 op=UNLOAD Jan 16 17:58:01.866000 audit: BPF prog-id=67 op=LOAD Jan 16 17:58:01.866000 audit: BPF prog-id=44 op=UNLOAD Jan 16 17:58:01.868000 audit: BPF prog-id=68 op=LOAD Jan 16 17:58:01.868000 audit: BPF prog-id=69 op=LOAD Jan 16 17:58:01.868000 audit: BPF prog-id=45 op=UNLOAD Jan 16 17:58:01.868000 audit: BPF prog-id=46 op=UNLOAD Jan 16 17:58:01.872000 audit: BPF prog-id=70 op=LOAD Jan 16 17:58:01.872000 audit: BPF prog-id=50 op=UNLOAD Jan 16 17:58:01.873000 audit: BPF prog-id=71 op=LOAD Jan 16 17:58:01.873000 audit: BPF prog-id=72 op=LOAD Jan 16 17:58:01.873000 audit: BPF prog-id=51 op=UNLOAD Jan 16 17:58:01.873000 audit: BPF prog-id=52 op=UNLOAD Jan 16 17:58:01.875000 audit: BPF prog-id=73 op=LOAD Jan 16 17:58:01.875000 audit: BPF prog-id=58 op=UNLOAD Jan 16 17:58:01.875000 audit: BPF prog-id=74 op=LOAD Jan 16 17:58:01.875000 audit: BPF prog-id=75 op=LOAD Jan 16 17:58:01.875000 audit: BPF prog-id=59 op=UNLOAD Jan 16 17:58:01.875000 audit: BPF prog-id=60 op=UNLOAD Jan 16 17:58:01.875000 audit: BPF prog-id=76 op=LOAD Jan 16 17:58:01.875000 audit: BPF prog-id=56 op=UNLOAD Jan 16 17:58:01.881000 audit: BPF prog-id=77 op=LOAD Jan 16 17:58:01.881000 audit: BPF prog-id=41 op=UNLOAD Jan 16 17:58:01.881000 audit: BPF prog-id=78 op=LOAD Jan 16 17:58:01.881000 audit: BPF prog-id=79 op=LOAD Jan 16 17:58:01.881000 audit: BPF prog-id=42 op=UNLOAD Jan 16 17:58:01.881000 audit: BPF prog-id=43 op=UNLOAD Jan 16 17:58:01.883000 audit: BPF prog-id=80 op=LOAD Jan 16 17:58:01.883000 audit: BPF prog-id=57 op=UNLOAD Jan 16 17:58:01.899514 systemd[1]: kubelet.service: Control process exited, code=killed, status=15/TERM Jan 16 17:58:01.899956 systemd[1]: kubelet.service: Failed with result 'signal'. Jan 16 17:58:01.900353 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:58:01.899000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=failed' Jan 16 17:58:01.900423 systemd[1]: kubelet.service: Consumed 108ms CPU time, 95.1M memory peak. 
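The burst of audit type=1334 (BPF) records above is a normal side effect of the systemd reload: the BPF programs systemd attaches to unit cgroups (device filters, IP accounting and the like) are re-loaded and their predecessors unloaded, so the records arrive as LOAD/UNLOAD pairs with fresh prog-ids. A quick tally sketch over journal lines of this shape (helper name illustrative):

import re
from collections import Counter

BPF = re.compile(r'audit: BPF prog-id=(?P<id>\d+) op=(?P<op>LOAD|UNLOAD)')

def bpf_ops(lines):
    # Count programs loaded vs unloaded during the reload window.
    return Counter(m.group("op") for line in lines for m in BPF.finditer(line))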
Jan 16 17:58:01.902513 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:58:02.054049 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:58:02.053000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:02.064882 (kubelet)[2451]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 16 17:58:02.107712 kubelet[2451]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 16 17:58:02.107712 kubelet[2451]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 17:58:02.107712 kubelet[2451]: I0116 17:58:02.106886 2451 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 16 17:58:02.640584 kubelet[2451]: I0116 17:58:02.639746 2451 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 16 17:58:02.640584 kubelet[2451]: I0116 17:58:02.639780 2451 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 16 17:58:02.640584 kubelet[2451]: I0116 17:58:02.639803 2451 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 16 17:58:02.640584 kubelet[2451]: I0116 17:58:02.639809 2451 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 16 17:58:02.640584 kubelet[2451]: I0116 17:58:02.640049 2451 server.go:956] "Client rotation is on, will bootstrap in background" Jan 16 17:58:02.649037 kubelet[2451]: E0116 17:58:02.648995 2451 certificate_manager.go:596] "Failed while requesting a signed certificate from the control plane" err="cannot create certificate signing request: Post \"https://49.12.189.56:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 49.12.189.56:6443: connect: connection refused" logger="kubernetes.io/kube-apiserver-client-kubelet.UnhandledError" Jan 16 17:58:02.650527 kubelet[2451]: I0116 17:58:02.650480 2451 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 16 17:58:02.657087 kubelet[2451]: I0116 17:58:02.657035 2451 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 16 17:58:02.660654 kubelet[2451]: I0116 17:58:02.660599 2451 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 16 17:58:02.660869 kubelet[2451]: I0116 17:58:02.660841 2451 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 16 17:58:02.661038 kubelet[2451]: I0116 17:58:02.660870 2451 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-03fd9ab712","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 16 17:58:02.661165 kubelet[2451]: I0116 17:58:02.661041 2451 topology_manager.go:138] "Creating topology manager with none policy" Jan 16 17:58:02.661165 kubelet[2451]: I0116 17:58:02.661050 2451 container_manager_linux.go:306] "Creating device plugin manager" Jan 16 17:58:02.661217 kubelet[2451]: I0116 17:58:02.661172 2451 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 16 17:58:02.665060 kubelet[2451]: I0116 17:58:02.665023 2451 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:58:02.666926 kubelet[2451]: I0116 17:58:02.666711 2451 kubelet.go:475] "Attempting to sync node with API server" Jan 16 17:58:02.666926 kubelet[2451]: I0116 17:58:02.666740 2451 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 16 17:58:02.666926 kubelet[2451]: I0116 17:58:02.666768 2451 kubelet.go:387] "Adding apiserver pod source" Jan 16 17:58:02.666926 kubelet[2451]: I0116 17:58:02.666778 2451 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 16 17:58:02.667595 kubelet[2451]: E0116 17:58:02.667529 2451 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://49.12.189.56:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-03fd9ab712&limit=500&resourceVersion=0\": dial tcp 49.12.189.56:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 16 17:58:02.668451 kubelet[2451]: E0116 17:58:02.668085 2451 reflector.go:205] "Failed to watch" err="failed to list *v1.Service: Get 
\"https://49.12.189.56:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 49.12.189.56:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Jan 16 17:58:02.668688 kubelet[2451]: I0116 17:58:02.668669 2451 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 16 17:58:02.669514 kubelet[2451]: I0116 17:58:02.669492 2451 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 16 17:58:02.669652 kubelet[2451]: I0116 17:58:02.669639 2451 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 16 17:58:02.669756 kubelet[2451]: W0116 17:58:02.669745 2451 probe.go:272] Flexvolume plugin directory at /opt/libexec/kubernetes/kubelet-plugins/volume/exec/ does not exist. Recreating. Jan 16 17:58:02.673937 kubelet[2451]: I0116 17:58:02.673917 2451 server.go:1262] "Started kubelet" Jan 16 17:58:02.676649 kubelet[2451]: I0116 17:58:02.676623 2451 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 16 17:58:02.679592 kubelet[2451]: E0116 17:58:02.678336 2451 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://49.12.189.56:6443/api/v1/namespaces/default/events\": dial tcp 49.12.189.56:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4580-0-0-p-03fd9ab712.188b47da8d67dbeb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4580-0-0-p-03fd9ab712,UID:ci-4580-0-0-p-03fd9ab712,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-03fd9ab712,},FirstTimestamp:2026-01-16 17:58:02.673888235 +0000 UTC m=+0.604547570,LastTimestamp:2026-01-16 17:58:02.673888235 +0000 UTC m=+0.604547570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-03fd9ab712,}" Jan 16 17:58:02.681190 kubelet[2451]: I0116 17:58:02.681113 2451 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 16 17:58:02.682035 kubelet[2451]: I0116 17:58:02.682000 2451 server.go:310] "Adding debug handlers to kubelet server" Jan 16 17:58:02.682000 audit[2465]: NETFILTER_CFG table=mangle:42 family=2 entries=2 op=nft_register_chain pid=2465 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.682000 audit[2465]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffff9e14a50 a2=0 a3=0 items=0 ppid=2451 pid=2465 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.682000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 16 17:58:02.683000 audit[2467]: NETFILTER_CFG table=filter:43 family=2 entries=1 op=nft_register_chain pid=2467 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.683000 audit[2467]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffce7e38b0 a2=0 a3=0 items=0 ppid=2451 pid=2467 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.683000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 16 17:58:02.685879 kubelet[2451]: I0116 17:58:02.685802 2451 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 16 17:58:02.685879 kubelet[2451]: I0116 17:58:02.685871 2451 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 16 17:58:02.686081 kubelet[2451]: I0116 17:58:02.686054 2451 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 16 17:58:02.686677 kubelet[2451]: I0116 17:58:02.686361 2451 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 16 17:58:02.687233 kubelet[2451]: I0116 17:58:02.687214 2451 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 16 17:58:02.687543 kubelet[2451]: E0116 17:58:02.687521 2451 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" Jan 16 17:58:02.687000 audit[2469]: NETFILTER_CFG table=filter:44 family=2 entries=2 op=nft_register_chain pid=2469 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.689409 kubelet[2451]: E0116 17:58:02.688586 2451 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.12.189.56:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-03fd9ab712?timeout=10s\": dial tcp 49.12.189.56:6443: connect: connection refused" interval="200ms" Jan 16 17:58:02.689409 kubelet[2451]: E0116 17:58:02.688728 2451 kubelet.go:1615] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="invalid capacity 0 on image filesystem" Jan 16 17:58:02.689409 kubelet[2451]: I0116 17:58:02.688799 2451 reconciler.go:29] "Reconciler: start to sync state" Jan 16 17:58:02.689409 kubelet[2451]: I0116 17:58:02.688827 2451 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 16 17:58:02.689409 kubelet[2451]: E0116 17:58:02.689188 2451 reflector.go:205] "Failed to watch" err="failed to list *v1.CSIDriver: Get \"https://49.12.189.56:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 49.12.189.56:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Jan 16 17:58:02.689933 kubelet[2451]: I0116 17:58:02.689897 2451 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 16 17:58:02.687000 audit[2469]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffecfc3770 a2=0 a3=0 items=0 ppid=2451 pid=2469 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.687000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 17:58:02.691460 kubelet[2451]: I0116 17:58:02.691427 2451 factory.go:223] Registration of the containerd container factory successfully Jan 16 17:58:02.691460 kubelet[2451]: I0116 17:58:02.691455 2451 factory.go:223] Registration of the systemd container factory successfully Jan 16 17:58:02.691000 audit[2471]: NETFILTER_CFG table=filter:45 family=2 entries=2 op=nft_register_chain pid=2471 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.691000 audit[2471]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=340 a0=3 a1=ffffc7f85770 a2=0 a3=0 items=0 ppid=2451 pid=2471 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.691000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 17:58:02.703000 audit[2476]: NETFILTER_CFG table=filter:46 family=2 entries=1 op=nft_register_rule pid=2476 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.703000 audit[2476]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=924 a0=3 a1=ffffd4bd7040 a2=0 a3=0 items=0 ppid=2451 pid=2476 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.703000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D41004B5542452D4649524557414C4C002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E7400626C6F636B20696E636F6D696E67206C6F63616C6E657420636F6E6E656374696F6E73002D2D647374003132372E302E302E302F380000002D2D737263003132372E Jan 16 17:58:02.705760 kubelet[2451]: I0116 17:58:02.705720 2451 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv4" Jan 16 17:58:02.705000 audit[2477]: NETFILTER_CFG table=mangle:47 family=10 entries=2 op=nft_register_chain pid=2477 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:02.705000 audit[2477]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=136 a0=3 a1=fffff7d290f0 a2=0 a3=0 items=0 ppid=2451 pid=2477 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.705000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D49505441424C45532D48494E54002D74006D616E676C65 Jan 16 17:58:02.707529 kubelet[2451]: I0116 17:58:02.707508 2451 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv6" Jan 16 17:58:02.707652 kubelet[2451]: I0116 17:58:02.707641 2451 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 16 17:58:02.707726 kubelet[2451]: I0116 17:58:02.707717 2451 kubelet.go:2427] "Starting kubelet main sync loop" Jan 16 17:58:02.707818 kubelet[2451]: E0116 17:58:02.707802 2451 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 16 17:58:02.707000 audit[2478]: NETFILTER_CFG table=mangle:48 family=2 entries=1 op=nft_register_chain pid=2478 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.707000 audit[2478]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff7942fd0 a2=0 a3=0 items=0 ppid=2451 pid=2478 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.707000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 16 17:58:02.709000 audit[2479]: NETFILTER_CFG table=nat:49 family=2 entries=1 op=nft_register_chain pid=2479 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.709000 audit[2479]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffdf0936b0 a2=0 a3=0 items=0 ppid=2451 pid=2479 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.709000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 16 17:58:02.710000 audit[2480]: NETFILTER_CFG table=mangle:50 family=10 entries=1 op=nft_register_chain pid=2480 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:02.710000 audit[2480]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffceefa4b0 a2=0 a3=0 items=0 ppid=2451 pid=2480 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.710000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006D616E676C65 Jan 16 17:58:02.711000 audit[2481]: NETFILTER_CFG table=filter:51 family=2 entries=1 op=nft_register_chain pid=2481 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:02.711000 audit[2481]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffda353f00 a2=0 a3=0 items=0 ppid=2451 
pid=2481 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.711000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 16 17:58:02.712000 audit[2482]: NETFILTER_CFG table=nat:52 family=10 entries=1 op=nft_register_chain pid=2482 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:02.712000 audit[2482]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd2191a30 a2=0 a3=0 items=0 ppid=2451 pid=2482 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.712000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D74006E6174 Jan 16 17:58:02.714343 kubelet[2451]: E0116 17:58:02.714287 2451 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://49.12.189.56:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 49.12.189.56:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 16 17:58:02.715252 kubelet[2451]: I0116 17:58:02.714988 2451 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 16 17:58:02.715252 kubelet[2451]: I0116 17:58:02.715004 2451 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 16 17:58:02.715252 kubelet[2451]: I0116 17:58:02.715021 2451 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:58:02.713000 audit[2483]: NETFILTER_CFG table=filter:53 family=10 entries=1 op=nft_register_chain pid=2483 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:02.713000 audit[2483]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd9def140 a2=0 a3=0 items=0 ppid=2451 pid=2483 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/bin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:02.713000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4B5542454C45542D43414E415259002D740066696C746572 Jan 16 17:58:02.716889 kubelet[2451]: I0116 17:58:02.716851 2451 policy_none.go:49] "None policy: Start" Jan 16 17:58:02.716889 kubelet[2451]: I0116 17:58:02.716871 2451 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 16 17:58:02.717013 kubelet[2451]: I0116 17:58:02.716999 2451 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 16 17:58:02.718576 kubelet[2451]: I0116 17:58:02.718233 2451 policy_none.go:47] "Start" Jan 16 17:58:02.725973 systemd[1]: Created slice kubepods.slice - libcontainer container kubepods.slice. Jan 16 17:58:02.742428 systemd[1]: Created slice kubepods-burstable.slice - libcontainer container kubepods-burstable.slice. Jan 16 17:58:02.747428 systemd[1]: Created slice kubepods-besteffort.slice - libcontainer container kubepods-besteffort.slice. 
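The container_manager_linux entry further up prints the kubelet's effective NodeConfig as an embedded JSON object, which makes it easy to check what the node actually started with (here: systemd cgroup driver, cgroup v2, and hard eviction thresholds such as memory.available < 100Mi and nodefs.available < 10%). A rough extraction sketch (helper name illustrative; it assumes the value starts with '{' and contains no braces inside string values):

import json

def node_config(line: str) -> dict:
    # Slice out the JSON object following "nodeConfig=" by brace matching.
    start = line.index("nodeConfig=") + len("nodeConfig=")
    depth = 0
    for i, ch in enumerate(line[start:], start):
        depth += (ch == "{") - (ch == "}")
        if depth == 0:
            return json.loads(line[start:i + 1])
    raise ValueError("unterminated nodeConfig object")

# usage: node_config(<the "Creating Container Manager object" line>)["HardEvictionThresholds"]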
Jan 16 17:58:02.757901 kubelet[2451]: E0116 17:58:02.757788 2451 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 16 17:58:02.758370 kubelet[2451]: I0116 17:58:02.758307 2451 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 16 17:58:02.758370 kubelet[2451]: I0116 17:58:02.758325 2451 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 16 17:58:02.759896 kubelet[2451]: I0116 17:58:02.759676 2451 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 16 17:58:02.760766 kubelet[2451]: E0116 17:58:02.760722 2451 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="no imagefs label for configured runtime" Jan 16 17:58:02.760858 kubelet[2451]: E0116 17:58:02.760803 2451 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ci-4580-0-0-p-03fd9ab712\" not found" Jan 16 17:58:02.821885 systemd[1]: Created slice kubepods-burstable-pod624dcbb593aa4d83fdb641457d2d078c.slice - libcontainer container kubepods-burstable-pod624dcbb593aa4d83fdb641457d2d078c.slice. Jan 16 17:58:02.832438 kubelet[2451]: E0116 17:58:02.832145 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.837687 systemd[1]: Created slice kubepods-burstable-pod100f46e8d4c0879d4967750549ac1aeb.slice - libcontainer container kubepods-burstable-pod100f46e8d4c0879d4967750549ac1aeb.slice. Jan 16 17:58:02.841885 kubelet[2451]: E0116 17:58:02.841855 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.844987 systemd[1]: Created slice kubepods-burstable-podc1a7522e6526f34e8d4a0a80704f22a8.slice - libcontainer container kubepods-burstable-podc1a7522e6526f34e8d4a0a80704f22a8.slice. 
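The kubepods-burstable-pod<UID>.slice units created above follow the systemd slice naming rule: every '-' in a slice name adds one level of nesting, so each per-pod slice lands under kubepods.slice/kubepods-burstable.slice/. A small Python sketch of that expansion (slice_to_cgroup_path is an illustrative helper; the /sys/fs/cgroup prefix assumes the unified cgroup v2 hierarchy):

# Expand a systemd slice name into its cgroupfs directory:
# every '-' in the name introduces one level of nesting.
def slice_to_cgroup_path(slice_name: str, root: str = "/sys/fs/cgroup") -> str:
    stem = slice_name.removesuffix(".slice")
    parts = stem.split("-")
    levels = ["-".join(parts[:i + 1]) + ".slice" for i in range(len(parts))]
    return "/".join([root] + levels)

# Static-pod UIDs (as above) contain no dashes; for API pods the kubelet
# replaces '-' in the UID with '_' before building the slice name.
print(slice_to_cgroup_path("kubepods-burstable-pod624dcbb593aa4d83fdb641457d2d078c.slice"))
# -> /sys/fs/cgroup/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod624dcbb593aa4d83fdb641457d2d078c.slice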
Jan 16 17:58:02.847335 kubelet[2451]: E0116 17:58:02.847308 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.861608 kubelet[2451]: I0116 17:58:02.861570 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.862018 kubelet[2451]: E0116 17:58:02.861934 2451 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.12.189.56:6443/api/v1/nodes\": dial tcp 49.12.189.56:6443: connect: connection refused" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.890254 kubelet[2451]: E0116 17:58:02.890126 2451 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.12.189.56:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-03fd9ab712?timeout=10s\": dial tcp 49.12.189.56:6443: connect: connection refused" interval="400ms" Jan 16 17:58:02.989886 kubelet[2451]: I0116 17:58:02.989736 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/624dcbb593aa4d83fdb641457d2d078c-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" (UID: \"624dcbb593aa4d83fdb641457d2d078c\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.989886 kubelet[2451]: I0116 17:58:02.989786 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/624dcbb593aa4d83fdb641457d2d078c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" (UID: \"624dcbb593aa4d83fdb641457d2d078c\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.989886 kubelet[2451]: I0116 17:58:02.989812 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.989886 kubelet[2451]: I0116 17:58:02.989830 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.989886 kubelet[2451]: I0116 17:58:02.989854 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.990145 kubelet[2451]: I0116 17:58:02.989959 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/624dcbb593aa4d83fdb641457d2d078c-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" (UID: \"624dcbb593aa4d83fdb641457d2d078c\") " 
pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.990145 kubelet[2451]: I0116 17:58:02.990014 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.990145 kubelet[2451]: I0116 17:58:02.990089 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:02.990215 kubelet[2451]: I0116 17:58:02.990146 2451 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c1a7522e6526f34e8d4a0a80704f22a8-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-03fd9ab712\" (UID: \"c1a7522e6526f34e8d4a0a80704f22a8\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:03.064493 kubelet[2451]: I0116 17:58:03.064449 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:03.064957 kubelet[2451]: E0116 17:58:03.064913 2451 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.12.189.56:6443/api/v1/nodes\": dial tcp 49.12.189.56:6443: connect: connection refused" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:03.136187 containerd[1547]: time="2026-01-16T17:58:03.136110181Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-03fd9ab712,Uid:624dcbb593aa4d83fdb641457d2d078c,Namespace:kube-system,Attempt:0,}" Jan 16 17:58:03.145601 containerd[1547]: time="2026-01-16T17:58:03.145527502Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-03fd9ab712,Uid:100f46e8d4c0879d4967750549ac1aeb,Namespace:kube-system,Attempt:0,}" Jan 16 17:58:03.150961 containerd[1547]: time="2026-01-16T17:58:03.150699864Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-03fd9ab712,Uid:c1a7522e6526f34e8d4a0a80704f22a8,Namespace:kube-system,Attempt:0,}" Jan 16 17:58:03.291192 kubelet[2451]: E0116 17:58:03.291029 2451 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://49.12.189.56:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/ci-4580-0-0-p-03fd9ab712?timeout=10s\": dial tcp 49.12.189.56:6443: connect: connection refused" interval="800ms" Jan 16 17:58:03.467494 kubelet[2451]: I0116 17:58:03.467386 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:03.468061 kubelet[2451]: E0116 17:58:03.468019 2451 kubelet_node_status.go:107] "Unable to register node with API server" err="Post \"https://49.12.189.56:6443/api/v1/nodes\": dial tcp 49.12.189.56:6443: connect: connection refused" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:03.579489 kubelet[2451]: E0116 17:58:03.579274 2451 reflector.go:205] "Failed to watch" err="failed to list *v1.RuntimeClass: Get \"https://49.12.189.56:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 
49.12.189.56:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" Jan 16 17:58:03.626154 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2231187817.mount: Deactivated successfully. Jan 16 17:58:03.633291 containerd[1547]: time="2026-01-16T17:58:03.633221294Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:58:03.635542 containerd[1547]: time="2026-01-16T17:58:03.635403574Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 16 17:58:03.638921 containerd[1547]: time="2026-01-16T17:58:03.638817564Z" level=info msg="ImageCreate event name:\"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:58:03.641987 containerd[1547]: time="2026-01-16T17:58:03.641904119Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:58:03.645582 containerd[1547]: time="2026-01-16T17:58:03.645379250Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 16 17:58:03.649761 containerd[1547]: time="2026-01-16T17:58:03.649710159Z" level=info msg="ImageCreate event name:\"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:58:03.651160 containerd[1547]: time="2026-01-16T17:58:03.650679256Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 502.788291ms" Jan 16 17:58:03.651346 containerd[1547]: time="2026-01-16T17:58:03.651283267Z" level=info msg="ImageUpdate event name:\"registry.k8s.io/pause:3.10\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"} labels:{key:\"io.cri-containerd.pinned\" value:\"pinned\"}" Jan 16 17:58:03.652916 containerd[1547]: time="2026-01-16T17:58:03.652857615Z" level=info msg="stop pulling image registry.k8s.io/pause:3.10: active requests=0, bytes read=0" Jan 16 17:58:03.656037 containerd[1547]: time="2026-01-16T17:58:03.655977702Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 516.741513ms" Jan 16 17:58:03.671602 containerd[1547]: time="2026-01-16T17:58:03.670610760Z" level=info msg="Pulled image \"registry.k8s.io/pause:3.10\" with image id \"sha256:afb61768ce381961ca0beff95337601f29dc70ff3ed14e5e4b3e5699057e6aa8\", repo tag \"registry.k8s.io/pause:3.10\", repo digest \"registry.k8s.io/pause@sha256:ee6521f290b2168b6e0935a181d4cff9be1ac3f505666ef0e3c98fae8199917a\", size \"267933\" in 517.677798ms" Jan 16 17:58:03.697332 kubelet[2451]: E0116 
17:58:03.697280 2451 reflector.go:205] "Failed to watch" err="failed to list *v1.Node: Get \"https://49.12.189.56:6443/api/v1/nodes?fieldSelector=metadata.name%3Dci-4580-0-0-p-03fd9ab712&limit=500&resourceVersion=0\": dial tcp 49.12.189.56:6443: connect: connection refused" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Jan 16 17:58:03.703689 containerd[1547]: time="2026-01-16T17:58:03.703622142Z" level=info msg="connecting to shim 94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11" address="unix:///run/containerd/s/93c92de28a9bcadde16c5fafc224e5e545fe0280eecbb9dd8326357dd6dcddd6" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:03.705361 containerd[1547]: time="2026-01-16T17:58:03.705317892Z" level=info msg="connecting to shim 6b6416c33cc84b3378f0cfb8558edfe27082e098381e82fdd880b0415df36628" address="unix:///run/containerd/s/5ccd6fc87d05c405c26ed0843e098e780bee9dade6470d727245a4a17fd0eae0" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:03.736839 systemd[1]: Started cri-containerd-94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11.scope - libcontainer container 94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11. Jan 16 17:58:03.741880 containerd[1547]: time="2026-01-16T17:58:03.741831454Z" level=info msg="connecting to shim c4bb15695d76fee4985d0eed800aa47b68c6ae10ac6c85f56fbb3359765ed8dc" address="unix:///run/containerd/s/79b2ee8cc5a0ed6ba70233efe6d8bbb222ceb239fb7c3c227b3123fa8b94e484" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:03.742584 systemd[1]: Started cri-containerd-6b6416c33cc84b3378f0cfb8558edfe27082e098381e82fdd880b0415df36628.scope - libcontainer container 6b6416c33cc84b3378f0cfb8558edfe27082e098381e82fdd880b0415df36628. Jan 16 17:58:03.758000 audit: BPF prog-id=81 op=LOAD Jan 16 17:58:03.759000 audit: BPF prog-id=82 op=LOAD Jan 16 17:58:03.759000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.759000 audit: BPF prog-id=82 op=UNLOAD Jan 16 17:58:03.759000 audit[2529]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.759000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.760000 audit: BPF prog-id=83 op=LOAD Jan 16 17:58:03.760000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 
key=(null) Jan 16 17:58:03.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.760000 audit: BPF prog-id=84 op=LOAD Jan 16 17:58:03.760000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.760000 audit: BPF prog-id=84 op=UNLOAD Jan 16 17:58:03.760000 audit[2529]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.760000 audit: BPF prog-id=83 op=UNLOAD Jan 16 17:58:03.760000 audit[2529]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.760000 audit: BPF prog-id=85 op=LOAD Jan 16 17:58:03.760000 audit[2529]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2502 pid=2529 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.760000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3934643430393032353665313032623461353231346532353065666635 Jan 16 17:58:03.767000 audit: BPF prog-id=86 op=LOAD Jan 16 17:58:03.768000 audit: BPF prog-id=87 op=LOAD Jan 16 17:58:03.768000 audit[2531]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.768000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.769000 audit: BPF prog-id=87 op=UNLOAD Jan 16 17:58:03.769000 audit[2531]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.769000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.769000 audit: BPF prog-id=88 op=LOAD Jan 16 17:58:03.769000 audit[2531]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.769000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.770000 audit: BPF prog-id=89 op=LOAD Jan 16 17:58:03.770000 audit[2531]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.770000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.771000 audit: BPF prog-id=89 op=UNLOAD Jan 16 17:58:03.771000 audit[2531]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.771000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.771000 audit: BPF prog-id=88 op=UNLOAD Jan 16 17:58:03.771000 audit[2531]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.771000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.771000 audit: BPF prog-id=90 op=LOAD Jan 16 17:58:03.771000 audit[2531]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2513 pid=2531 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.771000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3662363431366333336363383462333337386630636662383535386564 Jan 16 17:58:03.783608 systemd[1]: Started cri-containerd-c4bb15695d76fee4985d0eed800aa47b68c6ae10ac6c85f56fbb3359765ed8dc.scope - libcontainer container c4bb15695d76fee4985d0eed800aa47b68c6ae10ac6c85f56fbb3359765ed8dc. Jan 16 17:58:03.786694 kubelet[2451]: E0116 17:58:03.785450 2451 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://49.12.189.56:6443/api/v1/namespaces/default/events\": dial tcp 49.12.189.56:6443: connect: connection refused" event="&Event{ObjectMeta:{ci-4580-0-0-p-03fd9ab712.188b47da8d67dbeb default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ci-4580-0-0-p-03fd9ab712,UID:ci-4580-0-0-p-03fd9ab712,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ci-4580-0-0-p-03fd9ab712,},FirstTimestamp:2026-01-16 17:58:02.673888235 +0000 UTC m=+0.604547570,LastTimestamp:2026-01-16 17:58:02.673888235 +0000 UTC m=+0.604547570,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ci-4580-0-0-p-03fd9ab712,}" Jan 16 17:58:03.813000 audit: BPF prog-id=91 op=LOAD Jan 16 17:58:03.816000 audit: BPF prog-id=92 op=LOAD Jan 16 17:58:03.816000 audit[2583]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.816000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 17:58:03.816000 audit: BPF prog-id=92 op=UNLOAD Jan 16 17:58:03.816000 audit[2583]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.816000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 
17:58:03.817000 audit: BPF prog-id=93 op=LOAD Jan 16 17:58:03.817000 audit[2583]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 17:58:03.817000 audit: BPF prog-id=94 op=LOAD Jan 16 17:58:03.817000 audit[2583]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 17:58:03.817000 audit: BPF prog-id=94 op=UNLOAD Jan 16 17:58:03.817000 audit[2583]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 17:58:03.817000 audit: BPF prog-id=93 op=UNLOAD Jan 16 17:58:03.817000 audit[2583]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 17:58:03.817000 audit: BPF prog-id=95 op=LOAD Jan 16 17:58:03.817000 audit[2583]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=2562 pid=2583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.817000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6334626231353639356437366665653439383564306565643830306161 Jan 16 17:58:03.822575 containerd[1547]: time="2026-01-16T17:58:03.821481244Z" level=info msg="RunPodSandbox for 
&PodSandboxMetadata{Name:kube-apiserver-ci-4580-0-0-p-03fd9ab712,Uid:624dcbb593aa4d83fdb641457d2d078c,Namespace:kube-system,Attempt:0,} returns sandbox id \"6b6416c33cc84b3378f0cfb8558edfe27082e098381e82fdd880b0415df36628\"" Jan 16 17:58:03.822707 containerd[1547]: time="2026-01-16T17:58:03.822682902Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-controller-manager-ci-4580-0-0-p-03fd9ab712,Uid:100f46e8d4c0879d4967750549ac1aeb,Namespace:kube-system,Attempt:0,} returns sandbox id \"94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11\"" Jan 16 17:58:03.832887 containerd[1547]: time="2026-01-16T17:58:03.832753051Z" level=info msg="CreateContainer within sandbox \"94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:0,}" Jan 16 17:58:03.833055 containerd[1547]: time="2026-01-16T17:58:03.832944437Z" level=info msg="CreateContainer within sandbox \"6b6416c33cc84b3378f0cfb8558edfe27082e098381e82fdd880b0415df36628\" for container &ContainerMetadata{Name:kube-apiserver,Attempt:0,}" Jan 16 17:58:03.847854 containerd[1547]: time="2026-01-16T17:58:03.847807976Z" level=info msg="Container f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:03.854155 containerd[1547]: time="2026-01-16T17:58:03.854076120Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-scheduler-ci-4580-0-0-p-03fd9ab712,Uid:c1a7522e6526f34e8d4a0a80704f22a8,Namespace:kube-system,Attempt:0,} returns sandbox id \"c4bb15695d76fee4985d0eed800aa47b68c6ae10ac6c85f56fbb3359765ed8dc\"" Jan 16 17:58:03.855952 containerd[1547]: time="2026-01-16T17:58:03.855848857Z" level=info msg="Container 9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:03.861643 containerd[1547]: time="2026-01-16T17:58:03.861466294Z" level=info msg="CreateContainer within sandbox \"c4bb15695d76fee4985d0eed800aa47b68c6ae10ac6c85f56fbb3359765ed8dc\" for container &ContainerMetadata{Name:kube-scheduler,Attempt:0,}" Jan 16 17:58:03.861724 containerd[1547]: time="2026-01-16T17:58:03.861675087Z" level=info msg="CreateContainer within sandbox \"94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:0,} returns container id \"f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f\"" Jan 16 17:58:03.862933 containerd[1547]: time="2026-01-16T17:58:03.862906476Z" level=info msg="StartContainer for \"f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f\"" Jan 16 17:58:03.864135 containerd[1547]: time="2026-01-16T17:58:03.864103533Z" level=info msg="connecting to shim f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f" address="unix:///run/containerd/s/93c92de28a9bcadde16c5fafc224e5e545fe0280eecbb9dd8326357dd6dcddd6" protocol=ttrpc version=3 Jan 16 17:58:03.865660 containerd[1547]: time="2026-01-16T17:58:03.865061987Z" level=info msg="CreateContainer within sandbox \"6b6416c33cc84b3378f0cfb8558edfe27082e098381e82fdd880b0415df36628\" for &ContainerMetadata{Name:kube-apiserver,Attempt:0,} returns container id \"9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe\"" Jan 16 17:58:03.866620 containerd[1547]: time="2026-01-16T17:58:03.866157689Z" level=info msg="StartContainer for \"9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe\"" Jan 16 17:58:03.867295 containerd[1547]: 
time="2026-01-16T17:58:03.867261793Z" level=info msg="connecting to shim 9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe" address="unix:///run/containerd/s/5ccd6fc87d05c405c26ed0843e098e780bee9dade6470d727245a4a17fd0eae0" protocol=ttrpc version=3 Jan 16 17:58:03.878145 containerd[1547]: time="2026-01-16T17:58:03.878108612Z" level=info msg="Container a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:03.888835 containerd[1547]: time="2026-01-16T17:58:03.888787253Z" level=info msg="CreateContainer within sandbox \"c4bb15695d76fee4985d0eed800aa47b68c6ae10ac6c85f56fbb3359765ed8dc\" for &ContainerMetadata{Name:kube-scheduler,Attempt:0,} returns container id \"a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b\"" Jan 16 17:58:03.890795 containerd[1547]: time="2026-01-16T17:58:03.890665587Z" level=info msg="StartContainer for \"a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b\"" Jan 16 17:58:03.894407 containerd[1547]: time="2026-01-16T17:58:03.894315499Z" level=info msg="connecting to shim a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b" address="unix:///run/containerd/s/79b2ee8cc5a0ed6ba70233efe6d8bbb222ceb239fb7c3c227b3123fa8b94e484" protocol=ttrpc version=3 Jan 16 17:58:03.899830 systemd[1]: Started cri-containerd-9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe.scope - libcontainer container 9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe. Jan 16 17:58:03.903165 systemd[1]: Started cri-containerd-f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f.scope - libcontainer container f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f. Jan 16 17:58:03.925758 systemd[1]: Started cri-containerd-a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b.scope - libcontainer container a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b. 
Jan 16 17:58:03.927000 audit: BPF prog-id=96 op=LOAD Jan 16 17:58:03.928000 audit: BPF prog-id=97 op=LOAD Jan 16 17:58:03.928000 audit[2631]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.928000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.929000 audit: BPF prog-id=97 op=UNLOAD Jan 16 17:58:03.929000 audit[2631]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.929000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.930000 audit: BPF prog-id=98 op=LOAD Jan 16 17:58:03.930000 audit[2631]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.930000 audit: BPF prog-id=99 op=LOAD Jan 16 17:58:03.930000 audit[2631]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.930000 audit: BPF prog-id=99 op=UNLOAD Jan 16 17:58:03.930000 audit[2631]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.930000 audit: BPF prog-id=98 op=UNLOAD Jan 16 17:58:03.930000 audit[2631]: SYSCALL 
arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.930000 audit: BPF prog-id=100 op=LOAD Jan 16 17:58:03.930000 audit[2631]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2513 pid=2631 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.930000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3964336538613161323865323537343036386237393935653030656564 Jan 16 17:58:03.931000 audit: BPF prog-id=101 op=LOAD Jan 16 17:58:03.932000 audit: BPF prog-id=102 op=LOAD Jan 16 17:58:03.932000 audit[2629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2502 pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.932000 audit: BPF prog-id=102 op=UNLOAD Jan 16 17:58:03.932000 audit[2629]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.932000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.933000 audit: BPF prog-id=103 op=LOAD Jan 16 17:58:03.933000 audit[2629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2502 pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.933000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.935000 audit: BPF prog-id=104 op=LOAD Jan 16 17:58:03.935000 audit[2629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2502 
pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.935000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.935000 audit: BPF prog-id=104 op=UNLOAD Jan 16 17:58:03.935000 audit[2629]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.935000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.935000 audit: BPF prog-id=103 op=UNLOAD Jan 16 17:58:03.935000 audit[2629]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.935000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.935000 audit: BPF prog-id=105 op=LOAD Jan 16 17:58:03.935000 audit[2629]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2502 pid=2629 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.935000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6630666465646338396437653237313534616432643034383335633031 Jan 16 17:58:03.960000 audit: BPF prog-id=106 op=LOAD Jan 16 17:58:03.965000 audit: BPF prog-id=107 op=LOAD Jan 16 17:58:03.965000 audit[2653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.965000 audit: BPF prog-id=107 op=UNLOAD Jan 16 17:58:03.965000 audit[2653]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 
comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.965000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.966000 audit: BPF prog-id=108 op=LOAD Jan 16 17:58:03.966000 audit[2653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.966000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.967000 audit: BPF prog-id=109 op=LOAD Jan 16 17:58:03.967000 audit[2653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.967000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.967000 audit: BPF prog-id=109 op=UNLOAD Jan 16 17:58:03.967000 audit[2653]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.967000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.967000 audit: BPF prog-id=108 op=UNLOAD Jan 16 17:58:03.967000 audit[2653]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.967000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.967000 audit: BPF prog-id=110 op=LOAD Jan 16 17:58:03.967000 audit[2653]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=2562 pid=2653 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:03.967000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6133373933303663613433303939643231306466323438323937336161 Jan 16 17:58:03.984116 containerd[1547]: time="2026-01-16T17:58:03.984009308Z" level=info msg="StartContainer for \"9d3e8a1a28e2574068b7995e00eed97559452cf548935f836e157674261fedfe\" returns successfully" Jan 16 17:58:03.998622 containerd[1547]: time="2026-01-16T17:58:03.997329429Z" level=info msg="StartContainer for \"f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f\" returns successfully" Jan 16 17:58:04.026714 containerd[1547]: time="2026-01-16T17:58:04.026678072Z" level=info msg="StartContainer for \"a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b\" returns successfully" Jan 16 17:58:04.270264 kubelet[2451]: I0116 17:58:04.270228 2451 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:04.725432 kubelet[2451]: E0116 17:58:04.725360 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:04.729621 kubelet[2451]: E0116 17:58:04.729541 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:04.731041 kubelet[2451]: E0116 17:58:04.731013 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:05.732324 kubelet[2451]: E0116 17:58:05.731791 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:05.732324 kubelet[2451]: E0116 17:58:05.732190 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:05.733245 kubelet[2451]: E0116 17:58:05.733227 2451 kubelet.go:3215] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:05.830002 kubelet[2451]: E0116 17:58:05.829970 2451 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ci-4580-0-0-p-03fd9ab712\" not found" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.013569 kubelet[2451]: I0116 17:58:06.013301 2451 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.013569 kubelet[2451]: E0116 17:58:06.013343 2451 kubelet_node_status.go:486] "Error updating node status, will retry" err="error getting node \"ci-4580-0-0-p-03fd9ab712\": node \"ci-4580-0-0-p-03fd9ab712\" not found" Jan 16 17:58:06.088642 kubelet[2451]: I0116 17:58:06.088600 2451 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.098861 kubelet[2451]: E0116 17:58:06.098820 2451 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" is forbidden: no PriorityClass with name system-node-critical was found" 
pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.098861 kubelet[2451]: I0116 17:58:06.098852 2451 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.102409 kubelet[2451]: E0116 17:58:06.102375 2451 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.102409 kubelet[2451]: I0116 17:58:06.102404 2451 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.107686 kubelet[2451]: E0116 17:58:06.107649 2451 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4580-0-0-p-03fd9ab712\" is forbidden: no PriorityClass with name system-node-critical was found" pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:06.670502 kubelet[2451]: I0116 17:58:06.670461 2451 apiserver.go:52] "Watching apiserver" Jan 16 17:58:06.689708 kubelet[2451]: I0116 17:58:06.689674 2451 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 16 17:58:06.735586 kubelet[2451]: I0116 17:58:06.733633 2451 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.020629 systemd[1]: Reload requested from client PID 2733 ('systemctl') (unit session-8.scope)... Jan 16 17:58:08.020651 systemd[1]: Reloading... Jan 16 17:58:08.122577 zram_generator::config[2780]: No configuration found. Jan 16 17:58:08.356712 systemd[1]: Reloading finished in 335 ms. Jan 16 17:58:08.390466 systemd[1]: Stopping kubelet.service - kubelet: The Kubernetes Node Agent... Jan 16 17:58:08.409356 systemd[1]: kubelet.service: Deactivated successfully. Jan 16 17:58:08.410014 systemd[1]: Stopped kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:58:08.410131 systemd[1]: kubelet.service: Consumed 1.011s CPU time, 121.1M memory peak. Jan 16 17:58:08.409000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:08.412173 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 16 17:58:08.412307 kernel: audit: type=1131 audit(1768586288.409:387): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:08.417496 systemd[1]: Starting kubelet.service - kubelet: The Kubernetes Node Agent... 
Jan 16 17:58:08.418000 audit: BPF prog-id=111 op=LOAD Jan 16 17:58:08.418000 audit: BPF prog-id=67 op=UNLOAD Jan 16 17:58:08.422125 kernel: audit: type=1334 audit(1768586288.418:388): prog-id=111 op=LOAD Jan 16 17:58:08.422215 kernel: audit: type=1334 audit(1768586288.418:389): prog-id=67 op=UNLOAD Jan 16 17:58:08.421000 audit: BPF prog-id=112 op=LOAD Jan 16 17:58:08.421000 audit: BPF prog-id=113 op=LOAD Jan 16 17:58:08.421000 audit: BPF prog-id=68 op=UNLOAD Jan 16 17:58:08.421000 audit: BPF prog-id=69 op=UNLOAD Jan 16 17:58:08.422000 audit: BPF prog-id=114 op=LOAD Jan 16 17:58:08.422000 audit: BPF prog-id=61 op=UNLOAD Jan 16 17:58:08.422000 audit: BPF prog-id=115 op=LOAD Jan 16 17:58:08.422000 audit: BPF prog-id=116 op=LOAD Jan 16 17:58:08.422000 audit: BPF prog-id=62 op=UNLOAD Jan 16 17:58:08.422000 audit: BPF prog-id=63 op=UNLOAD Jan 16 17:58:08.423000 audit: BPF prog-id=117 op=LOAD Jan 16 17:58:08.423000 audit: BPF prog-id=66 op=UNLOAD Jan 16 17:58:08.426037 kernel: audit: type=1334 audit(1768586288.421:390): prog-id=112 op=LOAD Jan 16 17:58:08.426086 kernel: audit: type=1334 audit(1768586288.421:391): prog-id=113 op=LOAD Jan 16 17:58:08.426107 kernel: audit: type=1334 audit(1768586288.421:392): prog-id=68 op=UNLOAD Jan 16 17:58:08.426130 kernel: audit: type=1334 audit(1768586288.421:393): prog-id=69 op=UNLOAD Jan 16 17:58:08.426150 kernel: audit: type=1334 audit(1768586288.422:394): prog-id=114 op=LOAD Jan 16 17:58:08.426167 kernel: audit: type=1334 audit(1768586288.422:395): prog-id=61 op=UNLOAD Jan 16 17:58:08.426186 kernel: audit: type=1334 audit(1768586288.422:396): prog-id=115 op=LOAD Jan 16 17:58:08.425000 audit: BPF prog-id=118 op=LOAD Jan 16 17:58:08.425000 audit: BPF prog-id=73 op=UNLOAD Jan 16 17:58:08.426000 audit: BPF prog-id=119 op=LOAD Jan 16 17:58:08.426000 audit: BPF prog-id=120 op=LOAD Jan 16 17:58:08.426000 audit: BPF prog-id=74 op=UNLOAD Jan 16 17:58:08.426000 audit: BPF prog-id=75 op=UNLOAD Jan 16 17:58:08.427000 audit: BPF prog-id=121 op=LOAD Jan 16 17:58:08.427000 audit: BPF prog-id=122 op=LOAD Jan 16 17:58:08.427000 audit: BPF prog-id=64 op=UNLOAD Jan 16 17:58:08.427000 audit: BPF prog-id=65 op=UNLOAD Jan 16 17:58:08.428000 audit: BPF prog-id=123 op=LOAD Jan 16 17:58:08.428000 audit: BPF prog-id=76 op=UNLOAD Jan 16 17:58:08.434000 audit: BPF prog-id=124 op=LOAD Jan 16 17:58:08.435000 audit: BPF prog-id=80 op=UNLOAD Jan 16 17:58:08.437000 audit: BPF prog-id=125 op=LOAD Jan 16 17:58:08.438000 audit: BPF prog-id=77 op=UNLOAD Jan 16 17:58:08.438000 audit: BPF prog-id=126 op=LOAD Jan 16 17:58:08.438000 audit: BPF prog-id=127 op=LOAD Jan 16 17:58:08.438000 audit: BPF prog-id=78 op=UNLOAD Jan 16 17:58:08.438000 audit: BPF prog-id=79 op=UNLOAD Jan 16 17:58:08.439000 audit: BPF prog-id=128 op=LOAD Jan 16 17:58:08.439000 audit: BPF prog-id=70 op=UNLOAD Jan 16 17:58:08.439000 audit: BPF prog-id=129 op=LOAD Jan 16 17:58:08.439000 audit: BPF prog-id=130 op=LOAD Jan 16 17:58:08.439000 audit: BPF prog-id=71 op=UNLOAD Jan 16 17:58:08.439000 audit: BPF prog-id=72 op=UNLOAD Jan 16 17:58:08.590205 systemd[1]: Started kubelet.service - kubelet: The Kubernetes Node Agent. Jan 16 17:58:08.590000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=kubelet comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:58:08.606858 (kubelet)[2825]: kubelet.service: Referenced but unset environment variable evaluates to an empty string: KUBELET_EXTRA_ARGS Jan 16 17:58:08.662230 kubelet[2825]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI. Jan 16 17:58:08.662230 kubelet[2825]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 16 17:58:08.663117 kubelet[2825]: I0116 17:58:08.662417 2825 server.go:213] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 16 17:58:08.671140 kubelet[2825]: I0116 17:58:08.671100 2825 server.go:529] "Kubelet version" kubeletVersion="v1.34.1" Jan 16 17:58:08.671140 kubelet[2825]: I0116 17:58:08.671131 2825 server.go:531] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 16 17:58:08.671378 kubelet[2825]: I0116 17:58:08.671162 2825 watchdog_linux.go:95] "Systemd watchdog is not enabled" Jan 16 17:58:08.671378 kubelet[2825]: I0116 17:58:08.671168 2825 watchdog_linux.go:137] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Jan 16 17:58:08.671447 kubelet[2825]: I0116 17:58:08.671403 2825 server.go:956] "Client rotation is on, will bootstrap in background" Jan 16 17:58:08.673460 kubelet[2825]: I0116 17:58:08.673416 2825 certificate_store.go:147] "Loading cert/key pair from a file" filePath="/var/lib/kubelet/pki/kubelet-client-current.pem" Jan 16 17:58:08.676797 kubelet[2825]: I0116 17:58:08.676775 2825 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/pki/ca.crt" Jan 16 17:58:08.680617 kubelet[2825]: I0116 17:58:08.680598 2825 server.go:1423] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 16 17:58:08.687089 kubelet[2825]: I0116 17:58:08.686215 2825 server.go:781] "--cgroups-per-qos enabled, but --cgroup-root was not specified. 
Defaulting to /" Jan 16 17:58:08.687089 kubelet[2825]: I0116 17:58:08.686397 2825 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 16 17:58:08.687089 kubelet[2825]: I0116 17:58:08.686421 2825 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ci-4580-0-0-p-03fd9ab712","RuntimeCgroupsName":"","SystemCgroupsName":"","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":false,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":null,"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":-1,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 16 17:58:08.687089 kubelet[2825]: I0116 17:58:08.686667 2825 topology_manager.go:138] "Creating topology manager with none policy" Jan 16 17:58:08.687412 kubelet[2825]: I0116 17:58:08.686676 2825 container_manager_linux.go:306] "Creating device plugin manager" Jan 16 17:58:08.687412 kubelet[2825]: I0116 17:58:08.686704 2825 container_manager_linux.go:315] "Creating Dynamic Resource Allocation (DRA) manager" Jan 16 17:58:08.688014 kubelet[2825]: I0116 17:58:08.687985 2825 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:58:08.688152 kubelet[2825]: I0116 17:58:08.688132 2825 kubelet.go:475] "Attempting to sync node with API server" Jan 16 17:58:08.688152 kubelet[2825]: I0116 17:58:08.688151 2825 kubelet.go:376] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 16 17:58:08.688240 kubelet[2825]: I0116 17:58:08.688172 2825 kubelet.go:387] "Adding apiserver pod source" Jan 16 17:58:08.688240 kubelet[2825]: I0116 17:58:08.688186 2825 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Jan 16 17:58:08.691423 kubelet[2825]: I0116 17:58:08.691229 2825 kuberuntime_manager.go:291] "Container runtime initialized" containerRuntime="containerd" version="v2.1.5" apiVersion="v1" Jan 16 17:58:08.692933 kubelet[2825]: I0116 17:58:08.692149 2825 kubelet.go:940] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Jan 16 17:58:08.693398 kubelet[2825]: I0116 17:58:08.693384 2825 kubelet.go:964] "Not starting PodCertificateRequest manager because we are in static kubelet mode or the PodCertificateProjection feature gate is disabled" Jan 16 
17:58:08.696572 kubelet[2825]: I0116 17:58:08.696484 2825 server.go:1262] "Started kubelet" Jan 16 17:58:08.698067 kubelet[2825]: I0116 17:58:08.697695 2825 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 16 17:58:08.702246 kubelet[2825]: I0116 17:58:08.702203 2825 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Jan 16 17:58:08.703081 kubelet[2825]: I0116 17:58:08.703059 2825 server.go:310] "Adding debug handlers to kubelet server" Jan 16 17:58:08.708190 kubelet[2825]: I0116 17:58:08.708134 2825 ratelimit.go:56] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 16 17:58:08.708273 kubelet[2825]: I0116 17:58:08.708207 2825 server_v1.go:49] "podresources" method="list" useActivePods=true Jan 16 17:58:08.708401 kubelet[2825]: I0116 17:58:08.708382 2825 server.go:249] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 16 17:58:08.710771 kubelet[2825]: I0116 17:58:08.710732 2825 volume_manager.go:313] "Starting Kubelet Volume Manager" Jan 16 17:58:08.710981 kubelet[2825]: E0116 17:58:08.710937 2825 kubelet_node_status.go:404] "Error getting the current node from lister" err="node \"ci-4580-0-0-p-03fd9ab712\" not found" Jan 16 17:58:08.711298 kubelet[2825]: I0116 17:58:08.711227 2825 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 16 17:58:08.711366 kubelet[2825]: I0116 17:58:08.711344 2825 reconciler.go:29] "Reconciler: start to sync state" Jan 16 17:58:08.720594 kubelet[2825]: I0116 17:58:08.719977 2825 dynamic_serving_content.go:135] "Starting controller" name="kubelet-server-cert-files::/var/lib/kubelet/pki/kubelet.crt::/var/lib/kubelet/pki/kubelet.key" Jan 16 17:58:08.732584 kubelet[2825]: I0116 17:58:08.732216 2825 factory.go:223] Registration of the systemd container factory successfully Jan 16 17:58:08.732584 kubelet[2825]: I0116 17:58:08.732332 2825 factory.go:221] Registration of the crio container factory failed: Get "http://%2Fvar%2Frun%2Fcrio%2Fcrio.sock/info": dial unix /var/run/crio/crio.sock: connect: no such file or directory Jan 16 17:58:08.736457 kubelet[2825]: I0116 17:58:08.736380 2825 factory.go:223] Registration of the containerd container factory successfully Jan 16 17:58:08.743432 kubelet[2825]: I0116 17:58:08.743337 2825 kubelet_network_linux.go:54] "Initialized iptables rules." protocol="IPv4" Jan 16 17:58:08.744834 kubelet[2825]: I0116 17:58:08.744783 2825 kubelet_network_linux.go:54] "Initialized iptables rules." 
protocol="IPv6" Jan 16 17:58:08.744834 kubelet[2825]: I0116 17:58:08.744803 2825 status_manager.go:244] "Starting to sync pod status with apiserver" Jan 16 17:58:08.744834 kubelet[2825]: I0116 17:58:08.744825 2825 kubelet.go:2427] "Starting kubelet main sync loop" Jan 16 17:58:08.744974 kubelet[2825]: E0116 17:58:08.744859 2825 kubelet.go:2451] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 16 17:58:08.803819 kubelet[2825]: I0116 17:58:08.803777 2825 cpu_manager.go:221] "Starting CPU manager" policy="none" Jan 16 17:58:08.804238 kubelet[2825]: I0116 17:58:08.804110 2825 cpu_manager.go:222] "Reconciling" reconcilePeriod="10s" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804150 2825 state_mem.go:36] "Initialized new in-memory state store" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804738 2825 state_mem.go:88] "Updated default CPUSet" cpuSet="" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804750 2825 state_mem.go:96] "Updated CPUSet assignments" assignments={} Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804769 2825 policy_none.go:49] "None policy: Start" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804780 2825 memory_manager.go:187] "Starting memorymanager" policy="None" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804790 2825 state_mem.go:36] "Initializing new in-memory state store" logger="Memory Manager state checkpoint" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804888 2825 state_mem.go:77] "Updated machine memory state" logger="Memory Manager state checkpoint" Jan 16 17:58:08.805540 kubelet[2825]: I0116 17:58:08.804896 2825 policy_none.go:47] "Start" Jan 16 17:58:08.811947 kubelet[2825]: E0116 17:58:08.811908 2825 manager.go:513] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Jan 16 17:58:08.812685 kubelet[2825]: I0116 17:58:08.812663 2825 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 16 17:58:08.812861 kubelet[2825]: I0116 17:58:08.812818 2825 container_log_manager.go:146] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 16 17:58:08.813246 kubelet[2825]: I0116 17:58:08.813226 2825 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 16 17:58:08.819514 kubelet[2825]: E0116 17:58:08.819470 2825 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="no imagefs label for configured runtime" Jan 16 17:58:08.846635 kubelet[2825]: I0116 17:58:08.846587 2825 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.847536 kubelet[2825]: I0116 17:58:08.847482 2825 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.847846 kubelet[2825]: I0116 17:58:08.847244 2825 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.856751 kubelet[2825]: E0116 17:58:08.856650 2825 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-scheduler-ci-4580-0-0-p-03fd9ab712\" already exists" pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.913707 kubelet[2825]: I0116 17:58:08.912540 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/624dcbb593aa4d83fdb641457d2d078c-ca-certs\") pod \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" (UID: \"624dcbb593aa4d83fdb641457d2d078c\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.913845 kubelet[2825]: I0116 17:58:08.913782 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/624dcbb593aa4d83fdb641457d2d078c-k8s-certs\") pod \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" (UID: \"624dcbb593aa4d83fdb641457d2d078c\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.913888 kubelet[2825]: I0116 17:58:08.913856 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-certs\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-ca-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.913926 kubelet[2825]: I0116 17:58:08.913887 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvolume-dir\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-flexvolume-dir\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.913974 kubelet[2825]: I0116 17:58:08.913958 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"k8s-certs\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-k8s-certs\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.914175 kubelet[2825]: I0116 17:58:08.914104 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-usr-share-ca-certificates\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.914237 kubelet[2825]: I0116 17:58:08.914205 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/c1a7522e6526f34e8d4a0a80704f22a8-kubeconfig\") pod \"kube-scheduler-ci-4580-0-0-p-03fd9ab712\" (UID: \"c1a7522e6526f34e8d4a0a80704f22a8\") " pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.914237 kubelet[2825]: I0116 17:58:08.914226 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-share-ca-certificates\" (UniqueName: \"kubernetes.io/host-path/624dcbb593aa4d83fdb641457d2d078c-usr-share-ca-certificates\") pod \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" (UID: \"624dcbb593aa4d83fdb641457d2d078c\") " pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.914316 kubelet[2825]: I0116 17:58:08.914274 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubeconfig\" (UniqueName: \"kubernetes.io/host-path/100f46e8d4c0879d4967750549ac1aeb-kubeconfig\") pod \"kube-controller-manager-ci-4580-0-0-p-03fd9ab712\" (UID: \"100f46e8d4c0879d4967750549ac1aeb\") " pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.923262 kubelet[2825]: I0116 17:58:08.923152 2825 kubelet_node_status.go:75] "Attempting to register node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.938213 kubelet[2825]: I0116 17:58:08.937973 2825 kubelet_node_status.go:124] "Node was previously registered" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:08.938213 kubelet[2825]: I0116 17:58:08.938076 2825 kubelet_node_status.go:78] "Successfully registered node" node="ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:09.689764 kubelet[2825]: I0116 17:58:09.689630 2825 apiserver.go:52] "Watching apiserver" Jan 16 17:58:09.711460 kubelet[2825]: I0116 17:58:09.711411 2825 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 16 17:58:09.792802 kubelet[2825]: I0116 17:58:09.792453 2825 kubelet.go:3219] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:09.804085 kubelet[2825]: E0116 17:58:09.803967 2825 kubelet.go:3221] "Failed creating a mirror pod" err="pods \"kube-apiserver-ci-4580-0-0-p-03fd9ab712\" already exists" pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" Jan 16 17:58:09.823342 kubelet[2825]: I0116 17:58:09.821867 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712" podStartSLOduration=3.8218505130000002 podStartE2EDuration="3.821850513s" podCreationTimestamp="2026-01-16 17:58:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:58:09.820714029 +0000 UTC m=+1.208629844" watchObservedRunningTime="2026-01-16 17:58:09.821850513 +0000 UTC m=+1.209766328" Jan 16 17:58:09.848988 kubelet[2825]: I0116 17:58:09.848847 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-ci-4580-0-0-p-03fd9ab712" podStartSLOduration=1.8488284099999999 podStartE2EDuration="1.84882841s" podCreationTimestamp="2026-01-16 17:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:58:09.835688208 +0000 UTC m=+1.223604023" watchObservedRunningTime="2026-01-16 17:58:09.84882841 +0000 UTC m=+1.236744185" Jan 16 17:58:09.869880 kubelet[2825]: I0116 17:58:09.869678 2825 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-controller-manager-ci-4580-0-0-p-03fd9ab712" podStartSLOduration=1.869659452 podStartE2EDuration="1.869659452s" podCreationTimestamp="2026-01-16 17:58:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:58:09.850600652 +0000 UTC m=+1.238516467" watchObservedRunningTime="2026-01-16 17:58:09.869659452 +0000 UTC m=+1.257575267" Jan 16 17:58:13.222167 kubelet[2825]: I0116 17:58:13.222129 2825 kuberuntime_manager.go:1828] "Updating runtime config through cri with podcidr" CIDR="192.168.0.0/24" Jan 16 17:58:13.224267 kubelet[2825]: I0116 17:58:13.222714 2825 kubelet_network.go:47] "Updating Pod CIDR" originalPodCIDR="" newPodCIDR="192.168.0.0/24" Jan 16 17:58:13.224311 containerd[1547]: time="2026-01-16T17:58:13.222500836Z" level=info msg="No cni config template is specified, wait for other system components to drop the config." Jan 16 17:58:13.926205 systemd[1]: Created slice kubepods-besteffort-pod35a0dc42_7f15_47e2_8d8e_7c145f861b90.slice - libcontainer container kubepods-besteffort-pod35a0dc42_7f15_47e2_8d8e_7c145f861b90.slice. Jan 16 17:58:13.943822 kubelet[2825]: I0116 17:58:13.943766 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-proxy\" (UniqueName: \"kubernetes.io/configmap/35a0dc42-7f15-47e2-8d8e-7c145f861b90-kube-proxy\") pod \"kube-proxy-dcq7q\" (UID: \"35a0dc42-7f15-47e2-8d8e-7c145f861b90\") " pod="kube-system/kube-proxy-dcq7q" Jan 16 17:58:13.943822 kubelet[2825]: I0116 17:58:13.943820 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/35a0dc42-7f15-47e2-8d8e-7c145f861b90-xtables-lock\") pod \"kube-proxy-dcq7q\" (UID: \"35a0dc42-7f15-47e2-8d8e-7c145f861b90\") " pod="kube-system/kube-proxy-dcq7q" Jan 16 17:58:13.943822 kubelet[2825]: I0116 17:58:13.943839 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/35a0dc42-7f15-47e2-8d8e-7c145f861b90-lib-modules\") pod \"kube-proxy-dcq7q\" (UID: \"35a0dc42-7f15-47e2-8d8e-7c145f861b90\") " pod="kube-system/kube-proxy-dcq7q" Jan 16 17:58:13.944166 kubelet[2825]: I0116 17:58:13.943858 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8mkw\" (UniqueName: \"kubernetes.io/projected/35a0dc42-7f15-47e2-8d8e-7c145f861b90-kube-api-access-w8mkw\") pod \"kube-proxy-dcq7q\" (UID: \"35a0dc42-7f15-47e2-8d8e-7c145f861b90\") " pod="kube-system/kube-proxy-dcq7q" Jan 16 17:58:14.054995 kubelet[2825]: E0116 17:58:14.054942 2825 projected.go:291] Couldn't get configMap kube-system/kube-root-ca.crt: configmap "kube-root-ca.crt" not found Jan 16 17:58:14.054995 kubelet[2825]: E0116 17:58:14.054978 2825 projected.go:196] Error preparing data for projected volume kube-api-access-w8mkw for pod kube-system/kube-proxy-dcq7q: configmap "kube-root-ca.crt" not found Jan 16 17:58:14.055158 kubelet[2825]: E0116 17:58:14.055056 2825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/35a0dc42-7f15-47e2-8d8e-7c145f861b90-kube-api-access-w8mkw podName:35a0dc42-7f15-47e2-8d8e-7c145f861b90 nodeName:}" failed. No retries permitted until 2026-01-16 17:58:14.555033791 +0000 UTC m=+5.942949606 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-w8mkw" (UniqueName: "kubernetes.io/projected/35a0dc42-7f15-47e2-8d8e-7c145f861b90-kube-api-access-w8mkw") pod "kube-proxy-dcq7q" (UID: "35a0dc42-7f15-47e2-8d8e-7c145f861b90") : configmap "kube-root-ca.crt" not found Jan 16 17:58:14.508926 systemd[1]: Created slice kubepods-besteffort-pode631916b_31b3_4e18_bf92_b6c4ff2a5357.slice - libcontainer container kubepods-besteffort-pode631916b_31b3_4e18_bf92_b6c4ff2a5357.slice. Jan 16 17:58:14.548945 kubelet[2825]: I0116 17:58:14.548886 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/e631916b-31b3-4e18-bf92-b6c4ff2a5357-var-lib-calico\") pod \"tigera-operator-65cdcdfd6d-vjcj7\" (UID: \"e631916b-31b3-4e18-bf92-b6c4ff2a5357\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-vjcj7" Jan 16 17:58:14.549301 kubelet[2825]: I0116 17:58:14.548949 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnjth\" (UniqueName: \"kubernetes.io/projected/e631916b-31b3-4e18-bf92-b6c4ff2a5357-kube-api-access-hnjth\") pod \"tigera-operator-65cdcdfd6d-vjcj7\" (UID: \"e631916b-31b3-4e18-bf92-b6c4ff2a5357\") " pod="tigera-operator/tigera-operator-65cdcdfd6d-vjcj7" Jan 16 17:58:14.819987 containerd[1547]: time="2026-01-16T17:58:14.819799433Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-vjcj7,Uid:e631916b-31b3-4e18-bf92-b6c4ff2a5357,Namespace:tigera-operator,Attempt:0,}" Jan 16 17:58:14.840457 containerd[1547]: time="2026-01-16T17:58:14.839782873Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dcq7q,Uid:35a0dc42-7f15-47e2-8d8e-7c145f861b90,Namespace:kube-system,Attempt:0,}" Jan 16 17:58:14.849308 containerd[1547]: time="2026-01-16T17:58:14.849251413Z" level=info msg="connecting to shim 83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555" address="unix:///run/containerd/s/f7dc221116a2434887973625e1d1eeb87dd55bda2462bf0c6dc3bbb8d5af58cf" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:14.883498 containerd[1547]: time="2026-01-16T17:58:14.883458226Z" level=info msg="connecting to shim ec6dd40cf7245da7b063e2b8283b0b28e570c2856c9b8816f8995bd016329b66" address="unix:///run/containerd/s/b2a58baf3376bbe7c883f9779f63ead24a7af031d1331a6b8e6dc571a44cbee0" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:14.885828 systemd[1]: Started cri-containerd-83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555.scope - libcontainer container 83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555. 
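The pod_startup_latency_tracker entries a few lines above report podStartSLOduration values that, with firstStartedPulling/lastFinishedPulling at the zero time (no image pull), line up with watchObservedRunningTime minus podCreationTimestamp: 17:58:09.821850513 minus 17:58:06 gives the logged 3.821850513s for the scheduler, and 17:58:09.84882841 minus 17:58:08 gives 1.84882841s for the apiserver. A quick arithmetic check (a sketch; timestamps copied from the log, truncated to microseconds because datetime carries no finer precision):

```python
# Check podStartSLOduration for kube-system/kube-scheduler-ci-4580-0-0-p-03fd9ab712
# using the timestamps reported by pod_startup_latency_tracker above.
from datetime import datetime

FMT = "%Y-%m-%d %H:%M:%S.%f %z"

created  = datetime.strptime("2026-01-16 17:58:06.000000 +0000", FMT)  # podCreationTimestamp
observed = datetime.strptime("2026-01-16 17:58:09.821850 +0000", FMT)  # watchObservedRunningTime, truncated to microseconds

print((observed - created).total_seconds())  # 3.82185 -- the log reports 3.821850513s
```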
Jan 16 17:58:14.904972 kernel: kauditd_printk_skb: 32 callbacks suppressed Jan 16 17:58:14.905073 kernel: audit: type=1334 audit(1768586294.902:429): prog-id=131 op=LOAD Jan 16 17:58:14.902000 audit: BPF prog-id=131 op=LOAD Jan 16 17:58:14.904000 audit: BPF prog-id=132 op=LOAD Jan 16 17:58:14.904000 audit[2895]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.909719 kernel: audit: type=1334 audit(1768586294.904:430): prog-id=132 op=LOAD Jan 16 17:58:14.909800 kernel: audit: type=1300 audit(1768586294.904:430): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.904000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: BPF prog-id=132 op=UNLOAD Jan 16 17:58:14.913573 kernel: audit: type=1327 audit(1768586294.904:430): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit[2895]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.914851 kernel: audit: type=1334 audit(1768586294.905:431): prog-id=132 op=UNLOAD Jan 16 17:58:14.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.920711 kernel: audit: type=1300 audit(1768586294.905:431): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.920955 kernel: audit: type=1327 audit(1768586294.905:431): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: BPF prog-id=133 op=LOAD Jan 16 17:58:14.905000 audit[2895]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
17:58:14.924943 kernel: audit: type=1334 audit(1768586294.905:432): prog-id=133 op=LOAD Jan 16 17:58:14.925125 kernel: audit: type=1300 audit(1768586294.905:432): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: BPF prog-id=134 op=LOAD Jan 16 17:58:14.905000 audit[2895]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: BPF prog-id=134 op=UNLOAD Jan 16 17:58:14.905000 audit[2895]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: BPF prog-id=133 op=UNLOAD Jan 16 17:58:14.905000 audit[2895]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.928663 kernel: audit: type=1327 audit(1768586294.905:432): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.905000 audit: BPF prog-id=135 op=LOAD Jan 16 17:58:14.905000 audit[2895]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2883 pid=2895 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.905000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3833653733666662376333353835316564663632393231313565366539 Jan 16 17:58:14.935421 systemd[1]: Started cri-containerd-ec6dd40cf7245da7b063e2b8283b0b28e570c2856c9b8816f8995bd016329b66.scope - libcontainer container ec6dd40cf7245da7b063e2b8283b0b28e570c2856c9b8816f8995bd016329b66. Jan 16 17:58:14.952000 audit: BPF prog-id=136 op=LOAD Jan 16 17:58:14.953000 audit: BPF prog-id=137 op=LOAD Jan 16 17:58:14.953000 audit[2935]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128180 a2=98 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.953000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.954000 audit: BPF prog-id=137 op=UNLOAD Jan 16 17:58:14.954000 audit[2935]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.954000 audit: BPF prog-id=138 op=LOAD Jan 16 17:58:14.954000 audit[2935]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001283e8 a2=98 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.954000 audit: BPF prog-id=139 op=LOAD Jan 16 17:58:14.954000 audit[2935]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000128168 a2=98 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.954000 audit: BPF prog-id=139 op=UNLOAD Jan 16 17:58:14.954000 audit[2935]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.954000 audit: BPF prog-id=138 op=UNLOAD Jan 16 17:58:14.954000 audit[2935]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.954000 audit: BPF prog-id=140 op=LOAD Jan 16 17:58:14.954000 audit[2935]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000128648 a2=98 a3=0 items=0 ppid=2916 pid=2935 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:14.954000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6563366464343063663732343564613762303633653262383238336230 Jan 16 17:58:14.987867 containerd[1547]: time="2026-01-16T17:58:14.987771311Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:tigera-operator-65cdcdfd6d-vjcj7,Uid:e631916b-31b3-4e18-bf92-b6c4ff2a5357,Namespace:tigera-operator,Attempt:0,} returns sandbox id \"83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555\"" Jan 16 17:58:14.993168 containerd[1547]: time="2026-01-16T17:58:14.992894696Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\"" Jan 16 17:58:15.002585 containerd[1547]: time="2026-01-16T17:58:15.002477169Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:kube-proxy-dcq7q,Uid:35a0dc42-7f15-47e2-8d8e-7c145f861b90,Namespace:kube-system,Attempt:0,} returns sandbox id \"ec6dd40cf7245da7b063e2b8283b0b28e570c2856c9b8816f8995bd016329b66\"" Jan 16 17:58:15.012679 containerd[1547]: time="2026-01-16T17:58:15.012630624Z" level=info msg="CreateContainer within sandbox \"ec6dd40cf7245da7b063e2b8283b0b28e570c2856c9b8816f8995bd016329b66\" for container &ContainerMetadata{Name:kube-proxy,Attempt:0,}" Jan 16 17:58:15.024622 containerd[1547]: time="2026-01-16T17:58:15.023646597Z" level=info msg="Container 7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:15.033793 containerd[1547]: time="2026-01-16T17:58:15.033733600Z" level=info msg="CreateContainer within sandbox \"ec6dd40cf7245da7b063e2b8283b0b28e570c2856c9b8816f8995bd016329b66\" for &ContainerMetadata{Name:kube-proxy,Attempt:0,} returns container id \"7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f\"" Jan 16 17:58:15.034858 containerd[1547]: time="2026-01-16T17:58:15.034818358Z" level=info msg="StartContainer for \"7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f\"" 
Jan 16 17:58:15.036456 containerd[1547]: time="2026-01-16T17:58:15.036421611Z" level=info msg="connecting to shim 7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f" address="unix:///run/containerd/s/b2a58baf3376bbe7c883f9779f63ead24a7af031d1331a6b8e6dc571a44cbee0" protocol=ttrpc version=3 Jan 16 17:58:15.060820 systemd[1]: Started cri-containerd-7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f.scope - libcontainer container 7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f. Jan 16 17:58:15.114000 audit: BPF prog-id=141 op=LOAD Jan 16 17:58:15.114000 audit[2965]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2916 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.114000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761366437613365306262336163303164366463656637383664313530 Jan 16 17:58:15.114000 audit: BPF prog-id=142 op=LOAD Jan 16 17:58:15.114000 audit[2965]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2916 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.114000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761366437613365306262336163303164366463656637383664313530 Jan 16 17:58:15.115000 audit: BPF prog-id=142 op=UNLOAD Jan 16 17:58:15.115000 audit[2965]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=2916 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761366437613365306262336163303164366463656637383664313530 Jan 16 17:58:15.115000 audit: BPF prog-id=141 op=UNLOAD Jan 16 17:58:15.115000 audit[2965]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=2916 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761366437613365306262336163303164366463656637383664313530 Jan 16 17:58:15.115000 audit: BPF prog-id=143 op=LOAD Jan 16 17:58:15.115000 audit[2965]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2916 pid=2965 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" 
exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.115000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761366437613365306262336163303164366463656637383664313530 Jan 16 17:58:15.140560 containerd[1547]: time="2026-01-16T17:58:15.140505389Z" level=info msg="StartContainer for \"7a6d7a3e0bb3ac01d6dcef786d15067b085642d207da3066fda6c37e9c86b20f\" returns successfully" Jan 16 17:58:15.385000 audit[3030]: NETFILTER_CFG table=mangle:54 family=2 entries=1 op=nft_register_chain pid=3030 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.385000 audit[3030]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff0bdf6c0 a2=0 a3=1 items=0 ppid=2976 pid=3030 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.385000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 16 17:58:15.387000 audit[3031]: NETFILTER_CFG table=mangle:55 family=10 entries=1 op=nft_register_chain pid=3031 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.387000 audit[3031]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd0e32bb0 a2=0 a3=1 items=0 ppid=2976 pid=3031 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.387000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006D616E676C65 Jan 16 17:58:15.388000 audit[3034]: NETFILTER_CFG table=nat:56 family=10 entries=1 op=nft_register_chain pid=3034 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.388000 audit[3034]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc0ccd6f0 a2=0 a3=1 items=0 ppid=2976 pid=3034 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.388000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 16 17:58:15.389000 audit[3032]: NETFILTER_CFG table=nat:57 family=2 entries=1 op=nft_register_chain pid=3032 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.389000 audit[3032]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd222b960 a2=0 a3=1 items=0 ppid=2976 pid=3032 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.389000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D74006E6174 Jan 16 17:58:15.390000 audit[3035]: NETFILTER_CFG table=filter:58 family=10 entries=1 op=nft_register_chain pid=3035 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.390000 audit[3035]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=fffff93e0ba0 a2=0 a3=1 items=0 ppid=2976 pid=3035 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 
fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.390000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 16 17:58:15.390000 audit[3036]: NETFILTER_CFG table=filter:59 family=2 entries=1 op=nft_register_chain pid=3036 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.390000 audit[3036]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdaf19a10 a2=0 a3=1 items=0 ppid=2976 pid=3036 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.390000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D43414E415259002D740066696C746572 Jan 16 17:58:15.494000 audit[3039]: NETFILTER_CFG table=filter:60 family=2 entries=1 op=nft_register_chain pid=3039 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.494000 audit[3039]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=fffffe0e6d10 a2=0 a3=1 items=0 ppid=2976 pid=3039 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.494000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 16 17:58:15.497000 audit[3041]: NETFILTER_CFG table=filter:61 family=2 entries=1 op=nft_register_rule pid=3041 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.497000 audit[3041]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffd0560a30 a2=0 a3=1 items=0 ppid=2976 pid=3041 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.497000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73002D Jan 16 17:58:15.502000 audit[3044]: NETFILTER_CFG table=filter:62 family=2 entries=1 op=nft_register_rule pid=3044 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.502000 audit[3044]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffe4a3d790 a2=0 a3=1 items=0 ppid=2976 pid=3044 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.502000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 16 17:58:15.504000 audit[3045]: NETFILTER_CFG table=filter:63 family=2 entries=1 op=nft_register_chain pid=3045 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.504000 audit[3045]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd4c632b0 a2=0 a3=1 items=0 ppid=2976 pid=3045 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.504000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 16 17:58:15.507000 audit[3047]: NETFILTER_CFG table=filter:64 family=2 entries=1 op=nft_register_rule pid=3047 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.507000 audit[3047]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffe03547e0 a2=0 a3=1 items=0 ppid=2976 pid=3047 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.507000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 16 17:58:15.508000 audit[3048]: NETFILTER_CFG table=filter:65 family=2 entries=1 op=nft_register_chain pid=3048 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.508000 audit[3048]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=fffffec5ef00 a2=0 a3=1 items=0 ppid=2976 pid=3048 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.508000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 16 17:58:15.512000 audit[3050]: NETFILTER_CFG table=filter:66 family=2 entries=1 op=nft_register_rule pid=3050 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.512000 audit[3050]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffd382d320 a2=0 a3=1 items=0 ppid=2976 pid=3050 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.512000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.517000 audit[3053]: NETFILTER_CFG table=filter:67 family=2 entries=1 op=nft_register_rule pid=3053 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.517000 audit[3053]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffe150e140 a2=0 a3=1 items=0 ppid=2976 pid=3053 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.517000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.518000 audit[3054]: NETFILTER_CFG table=filter:68 family=2 entries=1 op=nft_register_chain pid=3054 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.518000 audit[3054]: 
SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc937c1e0 a2=0 a3=1 items=0 ppid=2976 pid=3054 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.518000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 16 17:58:15.522000 audit[3056]: NETFILTER_CFG table=filter:69 family=2 entries=1 op=nft_register_rule pid=3056 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.522000 audit[3056]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdb239850 a2=0 a3=1 items=0 ppid=2976 pid=3056 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.522000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 16 17:58:15.527000 audit[3057]: NETFILTER_CFG table=filter:70 family=2 entries=1 op=nft_register_chain pid=3057 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.527000 audit[3057]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffdce39c70 a2=0 a3=1 items=0 ppid=2976 pid=3057 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.527000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 16 17:58:15.530000 audit[3059]: NETFILTER_CFG table=filter:71 family=2 entries=1 op=nft_register_rule pid=3059 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.530000 audit[3059]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd8777c40 a2=0 a3=1 items=0 ppid=2976 pid=3059 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.530000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F5859 Jan 16 17:58:15.534000 audit[3062]: NETFILTER_CFG table=filter:72 family=2 entries=1 op=nft_register_rule pid=3062 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.534000 audit[3062]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffcc22be40 a2=0 a3=1 items=0 ppid=2976 pid=3062 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.534000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 16 17:58:15.538000 audit[3065]: NETFILTER_CFG table=filter:73 family=2 
entries=1 op=nft_register_rule pid=3065 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.538000 audit[3065]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff6b51410 a2=0 a3=1 items=0 ppid=2976 pid=3065 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.538000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 16 17:58:15.540000 audit[3066]: NETFILTER_CFG table=nat:74 family=2 entries=1 op=nft_register_chain pid=3066 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.540000 audit[3066]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffc44656c0 a2=0 a3=1 items=0 ppid=2976 pid=3066 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.540000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 16 17:58:15.545000 audit[3068]: NETFILTER_CFG table=nat:75 family=2 entries=1 op=nft_register_rule pid=3068 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.545000 audit[3068]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=fffff1b949f0 a2=0 a3=1 items=0 ppid=2976 pid=3068 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.545000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.548000 audit[3071]: NETFILTER_CFG table=nat:76 family=2 entries=1 op=nft_register_rule pid=3071 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.548000 audit[3071]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=fffffb7677b0 a2=0 a3=1 items=0 ppid=2976 pid=3071 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.548000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.550000 audit[3072]: NETFILTER_CFG table=nat:77 family=2 entries=1 op=nft_register_chain pid=3072 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.550000 audit[3072]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd12bff90 a2=0 a3=1 items=0 ppid=2976 pid=3072 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.550000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 16 17:58:15.552000 audit[3074]: 
NETFILTER_CFG table=nat:78 family=2 entries=1 op=nft_register_rule pid=3074 subj=system_u:system_r:kernel_t:s0 comm="iptables" Jan 16 17:58:15.552000 audit[3074]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=532 a0=3 a1=ffffd0d86900 a2=0 a3=1 items=0 ppid=2976 pid=3074 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.552000 audit: PROCTITLE proctitle=69707461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 16 17:58:15.573000 audit[3080]: NETFILTER_CFG table=filter:79 family=2 entries=8 op=nft_register_rule pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:15.573000 audit[3080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffd93ea0c0 a2=0 a3=1 items=0 ppid=2976 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.573000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:15.579000 audit[3080]: NETFILTER_CFG table=nat:80 family=2 entries=14 op=nft_register_chain pid=3080 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:15.579000 audit[3080]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5508 a0=3 a1=ffffd93ea0c0 a2=0 a3=1 items=0 ppid=2976 pid=3080 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.579000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:15.582000 audit[3085]: NETFILTER_CFG table=filter:81 family=10 entries=1 op=nft_register_chain pid=3085 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.582000 audit[3085]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=108 a0=3 a1=ffffe7573440 a2=0 a3=1 items=0 ppid=2976 pid=3085 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.582000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D45585445524E414C2D5345525649434553002D740066696C746572 Jan 16 17:58:15.585000 audit[3087]: NETFILTER_CFG table=filter:82 family=10 entries=2 op=nft_register_chain pid=3087 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.585000 audit[3087]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=836 a0=3 a1=ffffe29160b0 a2=0 a3=1 items=0 ppid=2976 pid=3087 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.585000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C73 Jan 16 17:58:15.589000 
audit[3090]: NETFILTER_CFG table=filter:83 family=10 entries=1 op=nft_register_rule pid=3090 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.589000 audit[3090]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=752 a0=3 a1=ffffe7112510 a2=0 a3=1 items=0 ppid=2976 pid=3090 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.589000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E657465732065787465726E616C6C792D76697369626C65207365727669636520706F7274616C Jan 16 17:58:15.590000 audit[3091]: NETFILTER_CFG table=filter:84 family=10 entries=1 op=nft_register_chain pid=3091 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.590000 audit[3091]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd79b8840 a2=0 a3=1 items=0 ppid=2976 pid=3091 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.590000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4E4F4445504F525453002D740066696C746572 Jan 16 17:58:15.594000 audit[3093]: NETFILTER_CFG table=filter:85 family=10 entries=1 op=nft_register_rule pid=3093 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.594000 audit[3093]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffdcdda450 a2=0 a3=1 items=0 ppid=2976 pid=3093 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.594000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206865616C746820636865636B207365727669636520706F727473002D6A004B5542452D4E4F4445504F525453 Jan 16 17:58:15.596000 audit[3094]: NETFILTER_CFG table=filter:86 family=10 entries=1 op=nft_register_chain pid=3094 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.596000 audit[3094]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffc2fcdea0 a2=0 a3=1 items=0 ppid=2976 pid=3094 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.596000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D740066696C746572 Jan 16 17:58:15.599000 audit[3096]: NETFILTER_CFG table=filter:87 family=10 entries=1 op=nft_register_rule pid=3096 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.599000 audit[3096]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=744 a0=3 a1=ffffdd1f7360 a2=0 a3=1 items=0 ppid=2976 pid=3096 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.599000 audit: PROCTITLE 
proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.603000 audit[3099]: NETFILTER_CFG table=filter:88 family=10 entries=2 op=nft_register_chain pid=3099 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.603000 audit[3099]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=828 a0=3 a1=ffffca09b7c0 a2=0 a3=1 items=0 ppid=2976 pid=3099 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.603000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.605000 audit[3100]: NETFILTER_CFG table=filter:89 family=10 entries=1 op=nft_register_chain pid=3100 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.605000 audit[3100]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe7533cd0 a2=0 a3=1 items=0 ppid=2976 pid=3100 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.605000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572 Jan 16 17:58:15.609000 audit[3102]: NETFILTER_CFG table=filter:90 family=10 entries=1 op=nft_register_rule pid=3102 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.609000 audit[3102]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffcbe8cda0 a2=0 a3=1 items=0 ppid=2976 pid=3102 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.609000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320666F7277617264696E672072756C6573002D6A004B5542452D464F5257415244 Jan 16 17:58:15.610000 audit[3103]: NETFILTER_CFG table=filter:91 family=10 entries=1 op=nft_register_chain pid=3103 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.610000 audit[3103]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=104 a0=3 a1=ffffd4b6deb0 a2=0 a3=1 items=0 ppid=2976 pid=3103 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.610000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D50524F58592D4649524557414C4C002D740066696C746572 Jan 16 17:58:15.613000 audit[3105]: NETFILTER_CFG table=filter:92 family=10 entries=1 op=nft_register_rule pid=3105 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.613000 audit[3105]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff0260510 a2=0 a3=1 items=0 ppid=2976 pid=3105 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) 
ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.613000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F58 Jan 16 17:58:15.617000 audit[3108]: NETFILTER_CFG table=filter:93 family=10 entries=1 op=nft_register_rule pid=3108 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.617000 audit[3108]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=ffffd3276d80 a2=0 a3=1 items=0 ppid=2976 pid=3108 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.617000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D50524F Jan 16 17:58:15.621000 audit[3111]: NETFILTER_CFG table=filter:94 family=10 entries=1 op=nft_register_rule pid=3111 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.621000 audit[3111]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=748 a0=3 a1=fffff0761ce0 a2=0 a3=1 items=0 ppid=2976 pid=3111 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.621000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900464F5257415244002D740066696C746572002D6D00636F6E6E747261636B002D2D63747374617465004E4557002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573206C6F61642062616C616E636572206669726577616C6C002D6A004B5542452D5052 Jan 16 17:58:15.622000 audit[3112]: NETFILTER_CFG table=nat:95 family=10 entries=1 op=nft_register_chain pid=3112 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.622000 audit[3112]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=96 a0=3 a1=ffffed66c6b0 a2=0 a3=1 items=0 ppid=2976 pid=3112 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.622000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D5345525649434553002D74006E6174 Jan 16 17:58:15.625000 audit[3114]: NETFILTER_CFG table=nat:96 family=10 entries=1 op=nft_register_rule pid=3114 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.625000 audit[3114]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=524 a0=3 a1=ffffd32f84b0 a2=0 a3=1 items=0 ppid=2976 pid=3114 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.625000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.630000 audit[3117]: NETFILTER_CFG table=nat:97 family=10 entries=1 op=nft_register_rule 
pid=3117 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.630000 audit[3117]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=528 a0=3 a1=ffffeda4b350 a2=0 a3=1 items=0 ppid=2976 pid=3117 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.630000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900505245524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E65746573207365727669636520706F7274616C73002D6A004B5542452D5345525649434553 Jan 16 17:58:15.632000 audit[3118]: NETFILTER_CFG table=nat:98 family=10 entries=1 op=nft_register_chain pid=3118 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.632000 audit[3118]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffd01f7240 a2=0 a3=1 items=0 ppid=2976 pid=3118 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.632000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D504F5354524F5554494E47002D74006E6174 Jan 16 17:58:15.635000 audit[3120]: NETFILTER_CFG table=nat:99 family=10 entries=2 op=nft_register_chain pid=3120 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.635000 audit[3120]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=612 a0=3 a1=fffff16510a0 a2=0 a3=1 items=0 ppid=2976 pid=3120 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.635000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900504F5354524F5554494E47002D74006E6174002D6D00636F6D6D656E74002D2D636F6D6D656E74006B756265726E6574657320706F7374726F7574696E672072756C6573002D6A004B5542452D504F5354524F5554494E47 Jan 16 17:58:15.636000 audit[3121]: NETFILTER_CFG table=filter:100 family=10 entries=1 op=nft_register_chain pid=3121 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.636000 audit[3121]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=100 a0=3 a1=ffffe4ca0300 a2=0 a3=1 items=0 ppid=2976 pid=3121 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.636000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4E004B5542452D4649524557414C4C002D740066696C746572 Jan 16 17:58:15.638000 audit[3123]: NETFILTER_CFG table=filter:101 family=10 entries=1 op=nft_register_rule pid=3123 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" Jan 16 17:58:15.638000 audit[3123]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffe57da3b0 a2=0 a3=1 items=0 ppid=2976 pid=3123 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.638000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D4900494E505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 17:58:15.643000 audit[3126]: NETFILTER_CFG table=filter:102 family=10 entries=1 op=nft_register_rule pid=3126 subj=system_u:system_r:kernel_t:s0 comm="ip6tables" 
Jan 16 17:58:15.643000 audit[3126]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=228 a0=3 a1=ffffea7b7960 a2=0 a3=1 items=0 ppid=2976 pid=3126 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.643000 audit: PROCTITLE proctitle=6970367461626C6573002D770035002D49004F5554505554002D740066696C746572002D6A004B5542452D4649524557414C4C Jan 16 17:58:15.649000 audit[3128]: NETFILTER_CFG table=filter:103 family=10 entries=3 op=nft_register_rule pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 16 17:58:15.649000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2088 a0=3 a1=fffff59a4310 a2=0 a3=1 items=0 ppid=2976 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.649000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:15.649000 audit[3128]: NETFILTER_CFG table=nat:104 family=10 entries=7 op=nft_register_chain pid=3128 subj=system_u:system_r:kernel_t:s0 comm="ip6tables-resto" Jan 16 17:58:15.649000 audit[3128]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2056 a0=3 a1=fffff59a4310 a2=0 a3=1 items=0 ppid=2976 pid=3128 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="ip6tables-resto" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:15.649000 audit: PROCTITLE proctitle=6970367461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:17.618973 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount586365600.mount: Deactivated successfully. 
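The proctitle= values in the audit records above are the hex-encoded command lines of the processes that touched netfilter, with NUL bytes separating the arguments. A minimal decoding sketch (plain Python, not part of the log; the sample string is copied from the KUBE-FORWARD record above):

# decode_proctitle.py - recover the argv behind an audit PROCTITLE hex field.
import binascii

def decode_proctitle(hex_value: str) -> str:
    raw = binascii.unhexlify(hex_value)            # hex -> raw bytes
    args = raw.split(b"\x00")                      # NUL separates argv entries
    return " ".join(a.decode("utf-8", "replace") for a in args if a)

if __name__ == "__main__":
    sample = "69707461626C6573002D770035002D4E004B5542452D464F5257415244002D740066696C746572"
    print(decode_proctitle(sample))                # -> iptables -w 5 -N KUBE-FORWARD -t filter

Decoded this way, the run of records is consistent with kube-proxy (ppid=2976) bootstrapping its chains: KUBE-EXTERNAL-SERVICES, KUBE-NODEPORTS, KUBE-SERVICES, KUBE-FORWARD, KUBE-PROXY-FIREWALL, KUBE-POSTROUTING and KUBE-FIREWALL being created and wired into the filter and nat tables for IPv4 (family=2) and IPv6 (family=10), followed by iptables-restore/ip6tables-restore -w 5 --noflush --counters runs. The longer PROCTITLE strings appear cut short by the audit proctitle length limit, which is why some decoded commands end mid-argument.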
Jan 16 17:58:17.756913 kubelet[2825]: I0116 17:58:17.755952 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-proxy-dcq7q" podStartSLOduration=4.755930142 podStartE2EDuration="4.755930142s" podCreationTimestamp="2026-01-16 17:58:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:58:15.841204621 +0000 UTC m=+7.229120396" watchObservedRunningTime="2026-01-16 17:58:17.755930142 +0000 UTC m=+9.143845957" Jan 16 17:58:21.380466 containerd[1547]: time="2026-01-16T17:58:21.380394350Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator:v1.38.7\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:21.381969 containerd[1547]: time="2026-01-16T17:58:21.381747815Z" level=info msg="stop pulling image quay.io/tigera/operator:v1.38.7: active requests=0, bytes read=20773434" Jan 16 17:58:21.382935 containerd[1547]: time="2026-01-16T17:58:21.382859648Z" level=info msg="ImageCreate event name:\"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:21.385737 containerd[1547]: time="2026-01-16T17:58:21.385306504Z" level=info msg="ImageCreate event name:\"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:21.387218 containerd[1547]: time="2026-01-16T17:58:21.386050166Z" level=info msg="Pulled image \"quay.io/tigera/operator:v1.38.7\" with image id \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\", repo tag \"quay.io/tigera/operator:v1.38.7\", repo digest \"quay.io/tigera/operator@sha256:1b629a1403f5b6d7243f7dd523d04b8a50352a33c1d4d6970b6002a8733acf2e\", size \"22147999\" in 6.393111701s" Jan 16 17:58:21.387218 containerd[1547]: time="2026-01-16T17:58:21.386084890Z" level=info msg="PullImage \"quay.io/tigera/operator:v1.38.7\" returns image reference \"sha256:19f52e4b7ea471a91d4186e9701288b905145dc20d4928cbbf2eac8d9dfce54b\"" Jan 16 17:58:21.391260 containerd[1547]: time="2026-01-16T17:58:21.391165428Z" level=info msg="CreateContainer within sandbox \"83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555\" for container &ContainerMetadata{Name:tigera-operator,Attempt:0,}" Jan 16 17:58:21.402789 containerd[1547]: time="2026-01-16T17:58:21.402323279Z" level=info msg="Container 96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:21.414174 containerd[1547]: time="2026-01-16T17:58:21.414130939Z" level=info msg="CreateContainer within sandbox \"83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555\" for &ContainerMetadata{Name:tigera-operator,Attempt:0,} returns container id \"96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208\"" Jan 16 17:58:21.417269 containerd[1547]: time="2026-01-16T17:58:21.416034960Z" level=info msg="StartContainer for \"96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208\"" Jan 16 17:58:21.418398 containerd[1547]: time="2026-01-16T17:58:21.418365200Z" level=info msg="connecting to shim 96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208" address="unix:///run/containerd/s/f7dc221116a2434887973625e1d1eeb87dd55bda2462bf0c6dc3bbb8d5af58cf" protocol=ttrpc version=3 Jan 16 17:58:21.439791 systemd[1]: Started 
cri-containerd-96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208.scope - libcontainer container 96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208. Jan 16 17:58:21.452000 audit: BPF prog-id=144 op=LOAD Jan 16 17:58:21.454924 kernel: kauditd_printk_skb: 202 callbacks suppressed Jan 16 17:58:21.454994 kernel: audit: type=1334 audit(1768586301.452:501): prog-id=144 op=LOAD Jan 16 17:58:21.454000 audit: BPF prog-id=145 op=LOAD Jan 16 17:58:21.454000 audit[3137]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.458434 kernel: audit: type=1334 audit(1768586301.454:502): prog-id=145 op=LOAD Jan 16 17:58:21.458530 kernel: audit: type=1300 audit(1768586301.454:502): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.458598 kernel: audit: type=1327 audit(1768586301.454:502): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.454000 audit: BPF prog-id=145 op=UNLOAD Jan 16 17:58:21.454000 audit[3137]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.463543 kernel: audit: type=1334 audit(1768586301.454:503): prog-id=145 op=UNLOAD Jan 16 17:58:21.463606 kernel: audit: type=1300 audit(1768586301.454:503): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.465699 kernel: audit: type=1327 audit(1768586301.454:503): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.454000 audit: BPF prog-id=146 op=LOAD Jan 16 17:58:21.467569 kernel: audit: type=1334 audit(1768586301.454:504): prog-id=146 op=LOAD Jan 16 17:58:21.467624 kernel: audit: type=1300 
audit(1768586301.454:504): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.454000 audit[3137]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.472768 kernel: audit: type=1327 audit(1768586301.454:504): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.454000 audit: BPF prog-id=147 op=LOAD Jan 16 17:58:21.454000 audit[3137]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.454000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.459000 audit: BPF prog-id=147 op=UNLOAD Jan 16 17:58:21.459000 audit[3137]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.459000 audit: BPF prog-id=146 op=UNLOAD Jan 16 17:58:21.459000 audit[3137]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.459000 audit: BPF prog-id=148 op=LOAD Jan 16 17:58:21.459000 audit[3137]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=2883 pid=3137 auid=4294967295 uid=0 gid=0 euid=0 suid=0 
fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:21.459000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3936303232363039633931323736326466666637326432323639353461 Jan 16 17:58:21.494347 containerd[1547]: time="2026-01-16T17:58:21.494284058Z" level=info msg="StartContainer for \"96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208\" returns successfully" Jan 16 17:58:21.843151 kubelet[2825]: I0116 17:58:21.843078 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="tigera-operator/tigera-operator-65cdcdfd6d-vjcj7" podStartSLOduration=1.446366883 podStartE2EDuration="7.8429311s" podCreationTimestamp="2026-01-16 17:58:14 +0000 UTC" firstStartedPulling="2026-01-16 17:58:14.990712717 +0000 UTC m=+6.378628532" lastFinishedPulling="2026-01-16 17:58:21.387276974 +0000 UTC m=+12.775192749" observedRunningTime="2026-01-16 17:58:21.842802283 +0000 UTC m=+13.230718138" watchObservedRunningTime="2026-01-16 17:58:21.8429311 +0000 UTC m=+13.230846955" Jan 16 17:58:27.457968 sudo[1884]: pam_unix(sudo:session): session closed for user root Jan 16 17:58:27.456000 audit[1884]: USER_END pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:58:27.460821 kernel: kauditd_printk_skb: 12 callbacks suppressed Jan 16 17:58:27.460875 kernel: audit: type=1106 audit(1768586307.456:509): pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_limits,pam_env,pam_umask,pam_unix acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:58:27.456000 audit[1884]: CRED_DISP pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? res=success' Jan 16 17:58:27.466731 kernel: audit: type=1104 audit(1768586307.456:510): pid=1884 uid=500 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="root" exe="/usr/bin/sudo" hostname=? addr=? terminal=? 
res=success' Jan 16 17:58:27.563259 sshd[1883]: Connection closed by 68.220.241.50 port 39474 Jan 16 17:58:27.563863 sshd-session[1879]: pam_unix(sshd:session): session closed for user core Jan 16 17:58:27.565000 audit[1879]: USER_END pid=1879 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:58:27.565000 audit[1879]: CRED_DISP pid=1879 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:58:27.571443 kernel: audit: type=1106 audit(1768586307.565:511): pid=1879 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:58:27.571511 kernel: audit: type=1104 audit(1768586307.565:512): pid=1879 uid=0 auid=500 ses=8 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 17:58:27.571000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-49.12.189.56:22-68.220.241.50:39474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:27.572761 systemd[1]: sshd@7-49.12.189.56:22-68.220.241.50:39474.service: Deactivated successfully. Jan 16 17:58:27.572855 systemd-logind[1532]: Session 8 logged out. Waiting for processes to exit. Jan 16 17:58:27.577306 systemd[1]: session-8.scope: Deactivated successfully. Jan 16 17:58:27.577570 kernel: audit: type=1131 audit(1768586307.571:513): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@7-49.12.189.56:22-68.220.241.50:39474 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:58:27.577649 systemd[1]: session-8.scope: Consumed 7.352s CPU time, 221.8M memory peak. Jan 16 17:58:27.580592 systemd-logind[1532]: Removed session 8. 
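In the pod_startup_latency_tracker record above for tigera-operator-65cdcdfd6d-vjcj7, the two reported durations are consistent with podStartE2EDuration being observedRunningTime minus podCreationTimestamp, and podStartSLOduration being that figure minus the image-pull window taken from the monotonic (m=+) offsets of firstStartedPulling and lastFinishedPulling. A quick check (plain Python, values copied from the record; not part of the log):

# startup_latency_check.py - re-derive the reported durations from the record's timestamps.
# All events fall inside the same minute (17:58), so plain seconds arithmetic is enough.
creation_s   = 14.0            # podCreationTimestamp   2026-01-16 17:58:14
observed_s   = 21.8429311      # observedRunningTime    2026-01-16 17:58:21.8429311
first_pull_m = 6.378628532     # firstStartedPulling, monotonic offset m=+6.378628532
last_pull_m  = 12.775192749    # lastFinishedPulling, monotonic offset m=+12.775192749

e2e = observed_s - creation_s                  # end-to-end: creation -> observed running
slo = e2e - (last_pull_m - first_pull_m)       # E2E minus the image-pull window

print(f"podStartE2EDuration ~ {e2e:.7f}s  (logged: 7.8429311s)")
print(f"podStartSLOduration ~ {slo:.9f}s (logged: 1.446366883s)")

The earlier kube-proxy-dcq7q record shows firstStartedPulling/lastFinishedPulling at the zero time, which fits the same relation: with no pull window, its SLO and E2E durations coincide at 4.755930142s.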
Jan 16 17:58:32.012000 audit[3218]: NETFILTER_CFG table=filter:105 family=2 entries=15 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:32.012000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd85e84e0 a2=0 a3=1 items=0 ppid=2976 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:32.017619 kernel: audit: type=1325 audit(1768586312.012:514): table=filter:105 family=2 entries=15 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:32.017683 kernel: audit: type=1300 audit(1768586312.012:514): arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffd85e84e0 a2=0 a3=1 items=0 ppid=2976 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:32.012000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:32.019237 kernel: audit: type=1327 audit(1768586312.012:514): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:32.021000 audit[3218]: NETFILTER_CFG table=nat:106 family=2 entries=12 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:32.021000 audit[3218]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd85e84e0 a2=0 a3=1 items=0 ppid=2976 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:32.029543 kernel: audit: type=1325 audit(1768586312.021:515): table=nat:106 family=2 entries=12 op=nft_register_rule pid=3218 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:32.029637 kernel: audit: type=1300 audit(1768586312.021:515): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffd85e84e0 a2=0 a3=1 items=0 ppid=2976 pid=3218 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:32.021000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:33.043000 audit[3220]: NETFILTER_CFG table=filter:107 family=2 entries=16 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:33.046178 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 17:58:33.046258 kernel: audit: type=1325 audit(1768586313.043:516): table=filter:107 family=2 entries=16 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:33.043000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5992 a0=3 a1=ffffc248a3b0 a2=0 a3=1 items=0 ppid=2976 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:33.049074 kernel: audit: type=1300 audit(1768586313.043:516): arch=c00000b7 syscall=211 
success=yes exit=5992 a0=3 a1=ffffc248a3b0 a2=0 a3=1 items=0 ppid=2976 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:33.043000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:33.051250 kernel: audit: type=1327 audit(1768586313.043:516): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:33.058000 audit[3220]: NETFILTER_CFG table=nat:108 family=2 entries=12 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:33.058000 audit[3220]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc248a3b0 a2=0 a3=1 items=0 ppid=2976 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:33.062712 kernel: audit: type=1325 audit(1768586313.058:517): table=nat:108 family=2 entries=12 op=nft_register_rule pid=3220 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:33.062795 kernel: audit: type=1300 audit(1768586313.058:517): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffc248a3b0 a2=0 a3=1 items=0 ppid=2976 pid=3220 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:33.058000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:33.063796 kernel: audit: type=1327 audit(1768586313.058:517): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:36.805000 audit[3222]: NETFILTER_CFG table=filter:109 family=2 entries=17 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:36.805000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffffb3b6250 a2=0 a3=1 items=0 ppid=2976 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:36.811887 kernel: audit: type=1325 audit(1768586316.805:518): table=filter:109 family=2 entries=17 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:36.811943 kernel: audit: type=1300 audit(1768586316.805:518): arch=c00000b7 syscall=211 success=yes exit=6736 a0=3 a1=fffffb3b6250 a2=0 a3=1 items=0 ppid=2976 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:36.805000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:36.811000 audit[3222]: NETFILTER_CFG table=nat:110 family=2 entries=12 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:36.814688 kernel: audit: type=1327 audit(1768586316.805:518): 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:36.814744 kernel: audit: type=1325 audit(1768586316.811:519): table=nat:110 family=2 entries=12 op=nft_register_rule pid=3222 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:36.811000 audit[3222]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffb3b6250 a2=0 a3=1 items=0 ppid=2976 pid=3222 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:36.811000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:37.841000 audit[3224]: NETFILTER_CFG table=filter:111 family=2 entries=19 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:37.841000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=fffffd217570 a2=0 a3=1 items=0 ppid=2976 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:37.841000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:37.847000 audit[3224]: NETFILTER_CFG table=nat:112 family=2 entries=12 op=nft_register_rule pid=3224 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:37.847000 audit[3224]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=fffffd217570 a2=0 a3=1 items=0 ppid=2976 pid=3224 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:37.847000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:40.392000 audit[3226]: NETFILTER_CFG table=filter:113 family=2 entries=21 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:40.396333 kernel: kauditd_printk_skb: 8 callbacks suppressed Jan 16 17:58:40.396408 kernel: audit: type=1325 audit(1768586320.392:522): table=filter:113 family=2 entries=21 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:40.392000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe40a1c50 a2=0 a3=1 items=0 ppid=2976 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:40.398912 kernel: audit: type=1300 audit(1768586320.392:522): arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffe40a1c50 a2=0 a3=1 items=0 ppid=2976 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:40.392000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:40.400487 kernel: audit: type=1327 audit(1768586320.392:522): 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:40.400000 audit[3226]: NETFILTER_CFG table=nat:114 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:40.400000 audit[3226]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe40a1c50 a2=0 a3=1 items=0 ppid=2976 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:40.405869 kernel: audit: type=1325 audit(1768586320.400:523): table=nat:114 family=2 entries=12 op=nft_register_rule pid=3226 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:40.405932 kernel: audit: type=1300 audit(1768586320.400:523): arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffe40a1c50 a2=0 a3=1 items=0 ppid=2976 pid=3226 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:40.405961 kernel: audit: type=1327 audit(1768586320.400:523): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:40.400000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:40.436592 kubelet[2825]: E0116 17:58:40.436522 2825 reflector.go:205] "Failed to watch" err="failed to list *v1.Secret: secrets \"typha-certs\" is forbidden: User \"system:node:ci-4580-0-0-p-03fd9ab712\" cannot list resource \"secrets\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4580-0-0-p-03fd9ab712' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"typha-certs\"" type="*v1.Secret" Jan 16 17:58:40.436929 kubelet[2825]: E0116 17:58:40.436623 2825 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"tigera-ca-bundle\" is forbidden: User \"system:node:ci-4580-0-0-p-03fd9ab712\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4580-0-0-p-03fd9ab712' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"tigera-ca-bundle\"" type="*v1.ConfigMap" Jan 16 17:58:40.436929 kubelet[2825]: E0116 17:58:40.436661 2825 status_manager.go:1018] "Failed to get status for pod" err="pods \"calico-typha-774699bc97-p7jq9\" is forbidden: User \"system:node:ci-4580-0-0-p-03fd9ab712\" cannot get resource \"pods\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4580-0-0-p-03fd9ab712' and this object" podUID="4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d" pod="calico-system/calico-typha-774699bc97-p7jq9" Jan 16 17:58:40.436929 kubelet[2825]: E0116 17:58:40.436712 2825 reflector.go:205] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ci-4580-0-0-p-03fd9ab712\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"calico-system\": no relationship found between node 'ci-4580-0-0-p-03fd9ab712' and this object" logger="UnhandledError" reflector="object-\"calico-system\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap" Jan 16 17:58:40.439821 systemd[1]: Created slice 
kubepods-besteffort-pod4eb5d9a2_5ef3_410c_b11d_03c24ab02e1d.slice - libcontainer container kubepods-besteffort-pod4eb5d9a2_5ef3_410c_b11d_03c24ab02e1d.slice. Jan 16 17:58:40.519237 kubelet[2825]: I0116 17:58:40.518037 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-tigera-ca-bundle\") pod \"calico-typha-774699bc97-p7jq9\" (UID: \"4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d\") " pod="calico-system/calico-typha-774699bc97-p7jq9" Jan 16 17:58:40.519237 kubelet[2825]: I0116 17:58:40.518646 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"typha-certs\" (UniqueName: \"kubernetes.io/secret/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-typha-certs\") pod \"calico-typha-774699bc97-p7jq9\" (UID: \"4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d\") " pod="calico-system/calico-typha-774699bc97-p7jq9" Jan 16 17:58:40.519237 kubelet[2825]: I0116 17:58:40.518682 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fpbm\" (UniqueName: \"kubernetes.io/projected/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-kube-api-access-7fpbm\") pod \"calico-typha-774699bc97-p7jq9\" (UID: \"4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d\") " pod="calico-system/calico-typha-774699bc97-p7jq9" Jan 16 17:58:40.666796 systemd[1]: Created slice kubepods-besteffort-pod19c043e8_4f90_4e85_af85_45e376d669ab.slice - libcontainer container kubepods-besteffort-pod19c043e8_4f90_4e85_af85_45e376d669ab.slice. Jan 16 17:58:40.719698 kubelet[2825]: I0116 17:58:40.719644 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-net-dir\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-cni-net-dir\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.720001 kubelet[2825]: I0116 17:58:40.719980 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/19c043e8-4f90-4e85-af85-45e376d669ab-tigera-ca-bundle\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.720180 kubelet[2825]: I0116 17:58:40.720163 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-log-dir\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-cni-log-dir\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.720352 kubelet[2825]: I0116 17:58:40.720328 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"flexvol-driver-host\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-flexvol-driver-host\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.720502 kubelet[2825]: I0116 17:58:40.720485 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"policysync\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-policysync\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 
17:58:40.720750 kubelet[2825]: I0116 17:58:40.720729 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-lib-modules\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.720853 kubelet[2825]: I0116 17:58:40.720836 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-calico\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-var-lib-calico\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.720953 kubelet[2825]: I0116 17:58:40.720925 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-run-calico\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-var-run-calico\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.721137 kubelet[2825]: I0116 17:58:40.721041 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"xtables-lock\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-xtables-lock\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.721137 kubelet[2825]: I0116 17:58:40.721103 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-bin-dir\" (UniqueName: \"kubernetes.io/host-path/19c043e8-4f90-4e85-af85-45e376d669ab-cni-bin-dir\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.721288 kubelet[2825]: I0116 17:58:40.721142 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-certs\" (UniqueName: \"kubernetes.io/secret/19c043e8-4f90-4e85-af85-45e376d669ab-node-certs\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.721288 kubelet[2825]: I0116 17:58:40.721169 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-txmfg\" (UniqueName: \"kubernetes.io/projected/19c043e8-4f90-4e85-af85-45e376d669ab-kube-api-access-txmfg\") pod \"calico-node-wgssq\" (UID: \"19c043e8-4f90-4e85-af85-45e376d669ab\") " pod="calico-system/calico-node-wgssq" Jan 16 17:58:40.825201 kubelet[2825]: E0116 17:58:40.825146 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.825201 kubelet[2825]: W0116 17:58:40.825176 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.825201 kubelet[2825]: E0116 17:58:40.825203 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.826395 kubelet[2825]: E0116 17:58:40.825361 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.826395 kubelet[2825]: W0116 17:58:40.825368 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.826395 kubelet[2825]: E0116 17:58:40.825376 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.826395 kubelet[2825]: E0116 17:58:40.825843 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.826395 kubelet[2825]: W0116 17:58:40.825856 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.826395 kubelet[2825]: E0116 17:58:40.825867 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.826583 kubelet[2825]: E0116 17:58:40.826457 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.826583 kubelet[2825]: W0116 17:58:40.826470 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.826583 kubelet[2825]: E0116 17:58:40.826484 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.829574 kubelet[2825]: E0116 17:58:40.828613 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.829574 kubelet[2825]: W0116 17:58:40.828635 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.829574 kubelet[2825]: E0116 17:58:40.828651 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.829574 kubelet[2825]: E0116 17:58:40.829125 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.829574 kubelet[2825]: W0116 17:58:40.829138 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.829574 kubelet[2825]: E0116 17:58:40.829150 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.829754 kubelet[2825]: E0116 17:58:40.829616 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.829754 kubelet[2825]: W0116 17:58:40.829627 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.829754 kubelet[2825]: E0116 17:58:40.829638 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.830078 kubelet[2825]: E0116 17:58:40.830006 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.830078 kubelet[2825]: W0116 17:58:40.830039 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.830078 kubelet[2825]: E0116 17:58:40.830051 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.831137 kubelet[2825]: E0116 17:58:40.830844 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.831137 kubelet[2825]: W0116 17:58:40.830864 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.831137 kubelet[2825]: E0116 17:58:40.830875 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.868779 kubelet[2825]: E0116 17:58:40.868251 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:40.911961 kubelet[2825]: E0116 17:58:40.911911 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.911961 kubelet[2825]: W0116 17:58:40.911943 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.911961 kubelet[2825]: E0116 17:58:40.911965 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.912271 kubelet[2825]: E0116 17:58:40.912175 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.912271 kubelet[2825]: W0116 17:58:40.912183 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.912271 kubelet[2825]: E0116 17:58:40.912224 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.912726 kubelet[2825]: E0116 17:58:40.912479 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.912726 kubelet[2825]: W0116 17:58:40.912494 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.912726 kubelet[2825]: E0116 17:58:40.912505 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.913830 kubelet[2825]: E0116 17:58:40.913798 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.913830 kubelet[2825]: W0116 17:58:40.913818 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.913830 kubelet[2825]: E0116 17:58:40.913832 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.914037 kubelet[2825]: E0116 17:58:40.914001 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.914037 kubelet[2825]: W0116 17:58:40.914015 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.914037 kubelet[2825]: E0116 17:58:40.914037 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.914186 kubelet[2825]: E0116 17:58:40.914172 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.914186 kubelet[2825]: W0116 17:58:40.914183 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.914186 kubelet[2825]: E0116 17:58:40.914191 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.914474 kubelet[2825]: E0116 17:58:40.914454 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.914474 kubelet[2825]: W0116 17:58:40.914471 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.914557 kubelet[2825]: E0116 17:58:40.914482 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.914785 kubelet[2825]: E0116 17:58:40.914765 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.914785 kubelet[2825]: W0116 17:58:40.914782 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.914855 kubelet[2825]: E0116 17:58:40.914792 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.914972 kubelet[2825]: E0116 17:58:40.914954 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.914972 kubelet[2825]: W0116 17:58:40.914969 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.915047 kubelet[2825]: E0116 17:58:40.914977 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.915178 kubelet[2825]: E0116 17:58:40.915162 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.915178 kubelet[2825]: W0116 17:58:40.915174 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.915321 kubelet[2825]: E0116 17:58:40.915183 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.917143 kubelet[2825]: E0116 17:58:40.917088 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.917143 kubelet[2825]: W0116 17:58:40.917108 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.917143 kubelet[2825]: E0116 17:58:40.917122 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.917358 kubelet[2825]: E0116 17:58:40.917283 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.917358 kubelet[2825]: W0116 17:58:40.917291 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.917358 kubelet[2825]: E0116 17:58:40.917311 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.917502 kubelet[2825]: E0116 17:58:40.917480 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.917502 kubelet[2825]: W0116 17:58:40.917495 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.917502 kubelet[2825]: E0116 17:58:40.917504 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.918167 kubelet[2825]: E0116 17:58:40.918085 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.918167 kubelet[2825]: W0116 17:58:40.918106 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.918167 kubelet[2825]: E0116 17:58:40.918118 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.918354 kubelet[2825]: E0116 17:58:40.918317 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.918354 kubelet[2825]: W0116 17:58:40.918342 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.918354 kubelet[2825]: E0116 17:58:40.918353 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.918987 kubelet[2825]: E0116 17:58:40.918963 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.918987 kubelet[2825]: W0116 17:58:40.918979 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.919095 kubelet[2825]: E0116 17:58:40.918992 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.919474 kubelet[2825]: E0116 17:58:40.919447 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.919474 kubelet[2825]: W0116 17:58:40.919466 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.919562 kubelet[2825]: E0116 17:58:40.919481 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.921321 kubelet[2825]: E0116 17:58:40.921285 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.921321 kubelet[2825]: W0116 17:58:40.921307 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.921321 kubelet[2825]: E0116 17:58:40.921320 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.921531 kubelet[2825]: E0116 17:58:40.921506 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.921531 kubelet[2825]: W0116 17:58:40.921521 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.921531 kubelet[2825]: E0116 17:58:40.921531 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.921773 kubelet[2825]: E0116 17:58:40.921752 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.921773 kubelet[2825]: W0116 17:58:40.921765 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.921773 kubelet[2825]: E0116 17:58:40.921775 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.922685 kubelet[2825]: E0116 17:58:40.922602 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.922685 kubelet[2825]: W0116 17:58:40.922618 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.922685 kubelet[2825]: E0116 17:58:40.922630 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.922685 kubelet[2825]: I0116 17:58:40.922658 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/8f14bccb-f353-467f-b549-674ae9114a0e-socket-dir\") pod \"csi-node-driver-lqf7c\" (UID: \"8f14bccb-f353-467f-b549-674ae9114a0e\") " pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:40.923681 kubelet[2825]: E0116 17:58:40.923651 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.923681 kubelet[2825]: W0116 17:58:40.923672 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.923779 kubelet[2825]: E0116 17:58:40.923688 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.923779 kubelet[2825]: I0116 17:58:40.923714 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"varrun\" (UniqueName: \"kubernetes.io/host-path/8f14bccb-f353-467f-b549-674ae9114a0e-varrun\") pod \"csi-node-driver-lqf7c\" (UID: \"8f14bccb-f353-467f-b549-674ae9114a0e\") " pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:40.923918 kubelet[2825]: E0116 17:58:40.923899 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.923918 kubelet[2825]: W0116 17:58:40.923914 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.923990 kubelet[2825]: E0116 17:58:40.923923 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.923990 kubelet[2825]: I0116 17:58:40.923943 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6hztj\" (UniqueName: \"kubernetes.io/projected/8f14bccb-f353-467f-b549-674ae9114a0e-kube-api-access-6hztj\") pod \"csi-node-driver-lqf7c\" (UID: \"8f14bccb-f353-467f-b549-674ae9114a0e\") " pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:40.924150 kubelet[2825]: E0116 17:58:40.924127 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.924150 kubelet[2825]: W0116 17:58:40.924143 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.924150 kubelet[2825]: E0116 17:58:40.924152 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.924273 kubelet[2825]: I0116 17:58:40.924249 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/8f14bccb-f353-467f-b549-674ae9114a0e-kubelet-dir\") pod \"csi-node-driver-lqf7c\" (UID: \"8f14bccb-f353-467f-b549-674ae9114a0e\") " pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:40.924718 kubelet[2825]: E0116 17:58:40.924689 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.924718 kubelet[2825]: W0116 17:58:40.924710 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.924718 kubelet[2825]: E0116 17:58:40.924721 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.924905 kubelet[2825]: E0116 17:58:40.924882 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.924905 kubelet[2825]: W0116 17:58:40.924896 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.924905 kubelet[2825]: E0116 17:58:40.924904 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.925122 kubelet[2825]: E0116 17:58:40.925051 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.925122 kubelet[2825]: W0116 17:58:40.925059 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.925122 kubelet[2825]: E0116 17:58:40.925066 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.925289 kubelet[2825]: E0116 17:58:40.925199 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.925289 kubelet[2825]: W0116 17:58:40.925211 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.925289 kubelet[2825]: E0116 17:58:40.925218 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.925289 kubelet[2825]: I0116 17:58:40.925242 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/8f14bccb-f353-467f-b549-674ae9114a0e-registration-dir\") pod \"csi-node-driver-lqf7c\" (UID: \"8f14bccb-f353-467f-b549-674ae9114a0e\") " pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:40.925699 kubelet[2825]: E0116 17:58:40.925665 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.925699 kubelet[2825]: W0116 17:58:40.925685 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.925699 kubelet[2825]: E0116 17:58:40.925698 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.925933 kubelet[2825]: E0116 17:58:40.925862 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.925933 kubelet[2825]: W0116 17:58:40.925870 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.925933 kubelet[2825]: E0116 17:58:40.925879 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.926729 kubelet[2825]: E0116 17:58:40.926709 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.926729 kubelet[2825]: W0116 17:58:40.926726 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.926861 kubelet[2825]: E0116 17:58:40.926741 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.926945 kubelet[2825]: E0116 17:58:40.926922 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.926945 kubelet[2825]: W0116 17:58:40.926938 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.927012 kubelet[2825]: E0116 17:58:40.926948 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:40.927381 kubelet[2825]: E0116 17:58:40.927355 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.927381 kubelet[2825]: W0116 17:58:40.927374 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.927381 kubelet[2825]: E0116 17:58:40.927385 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.927573 kubelet[2825]: E0116 17:58:40.927556 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.927621 kubelet[2825]: W0116 17:58:40.927578 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.927621 kubelet[2825]: E0116 17:58:40.927588 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:40.928742 kubelet[2825]: E0116 17:58:40.928714 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:40.928742 kubelet[2825]: W0116 17:58:40.928735 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:40.928742 kubelet[2825]: E0116 17:58:40.928747 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.027351 kubelet[2825]: E0116 17:58:41.027309 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.027351 kubelet[2825]: W0116 17:58:41.027343 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.027695 kubelet[2825]: E0116 17:58:41.027371 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.027982 kubelet[2825]: E0116 17:58:41.027909 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.027982 kubelet[2825]: W0116 17:58:41.027952 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.027982 kubelet[2825]: E0116 17:58:41.027980 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.028353 kubelet[2825]: E0116 17:58:41.028274 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.028353 kubelet[2825]: W0116 17:58:41.028289 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.028353 kubelet[2825]: E0116 17:58:41.028315 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.028712 kubelet[2825]: E0116 17:58:41.028463 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.028712 kubelet[2825]: W0116 17:58:41.028471 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.028712 kubelet[2825]: E0116 17:58:41.028479 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.029146 kubelet[2825]: E0116 17:58:41.029123 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.029146 kubelet[2825]: W0116 17:58:41.029142 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.029595 kubelet[2825]: E0116 17:58:41.029155 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.029595 kubelet[2825]: E0116 17:58:41.029348 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.029595 kubelet[2825]: W0116 17:58:41.029356 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.029595 kubelet[2825]: E0116 17:58:41.029365 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.029945 kubelet[2825]: E0116 17:58:41.029826 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.029945 kubelet[2825]: W0116 17:58:41.029845 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.029945 kubelet[2825]: E0116 17:58:41.029856 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.030251 kubelet[2825]: E0116 17:58:41.030047 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.030251 kubelet[2825]: W0116 17:58:41.030055 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.030251 kubelet[2825]: E0116 17:58:41.030064 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.031018 kubelet[2825]: E0116 17:58:41.030665 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.031018 kubelet[2825]: W0116 17:58:41.030678 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.031018 kubelet[2825]: E0116 17:58:41.030689 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.031018 kubelet[2825]: E0116 17:58:41.030840 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.031018 kubelet[2825]: W0116 17:58:41.030847 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.031018 kubelet[2825]: E0116 17:58:41.030856 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.031664 kubelet[2825]: E0116 17:58:41.031643 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.031664 kubelet[2825]: W0116 17:58:41.031658 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.031664 kubelet[2825]: E0116 17:58:41.031670 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.031849 kubelet[2825]: E0116 17:58:41.031837 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.031849 kubelet[2825]: W0116 17:58:41.031845 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.031896 kubelet[2825]: E0116 17:58:41.031853 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.032046 kubelet[2825]: E0116 17:58:41.032017 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.032265 kubelet[2825]: W0116 17:58:41.032071 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.032384 kubelet[2825]: E0116 17:58:41.032366 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.032612 kubelet[2825]: E0116 17:58:41.032590 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.032612 kubelet[2825]: W0116 17:58:41.032606 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.032706 kubelet[2825]: E0116 17:58:41.032618 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.033271 kubelet[2825]: E0116 17:58:41.032775 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.033271 kubelet[2825]: W0116 17:58:41.032789 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.033271 kubelet[2825]: E0116 17:58:41.032798 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.033271 kubelet[2825]: E0116 17:58:41.032932 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.033271 kubelet[2825]: W0116 17:58:41.032939 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.033271 kubelet[2825]: E0116 17:58:41.032947 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.033271 kubelet[2825]: E0116 17:58:41.033089 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.033271 kubelet[2825]: W0116 17:58:41.033097 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.033271 kubelet[2825]: E0116 17:58:41.033105 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.033613 kubelet[2825]: E0116 17:58:41.033596 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.033613 kubelet[2825]: W0116 17:58:41.033610 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.033689 kubelet[2825]: E0116 17:58:41.033620 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.033943 kubelet[2825]: E0116 17:58:41.033922 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.033943 kubelet[2825]: W0116 17:58:41.033938 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.034029 kubelet[2825]: E0116 17:58:41.033948 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.034753 kubelet[2825]: E0116 17:58:41.034740 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.034846 kubelet[2825]: W0116 17:58:41.034833 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.034900 kubelet[2825]: E0116 17:58:41.034889 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.035785 kubelet[2825]: E0116 17:58:41.035769 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.036495 kubelet[2825]: W0116 17:58:41.036474 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.036637 kubelet[2825]: E0116 17:58:41.036624 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.036964 kubelet[2825]: E0116 17:58:41.036951 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.037051 kubelet[2825]: W0116 17:58:41.037029 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.037101 kubelet[2825]: E0116 17:58:41.037092 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.037408 kubelet[2825]: E0116 17:58:41.037376 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.037408 kubelet[2825]: W0116 17:58:41.037388 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.037408 kubelet[2825]: E0116 17:58:41.037398 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.037780 kubelet[2825]: E0116 17:58:41.037767 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.038208 kubelet[2825]: W0116 17:58:41.038188 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.038379 kubelet[2825]: E0116 17:58:41.038295 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.038690 kubelet[2825]: E0116 17:58:41.038677 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.038798 kubelet[2825]: W0116 17:58:41.038761 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.038798 kubelet[2825]: E0116 17:58:41.038777 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.419000 audit[3310]: NETFILTER_CFG table=filter:115 family=2 entries=22 op=nft_register_rule pid=3310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:41.419000 audit[3310]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffcb0a64b0 a2=0 a3=1 items=0 ppid=2976 pid=3310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:41.423671 kernel: audit: type=1325 audit(1768586321.419:524): table=filter:115 family=2 entries=22 op=nft_register_rule pid=3310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:41.419000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:41.427434 kernel: audit: type=1300 audit(1768586321.419:524): arch=c00000b7 syscall=211 success=yes exit=8224 a0=3 a1=ffffcb0a64b0 a2=0 a3=1 items=0 ppid=2976 pid=3310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:41.427493 kernel: audit: type=1327 audit(1768586321.419:524): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:41.428000 audit[3310]: NETFILTER_CFG table=nat:116 family=2 entries=12 op=nft_register_rule pid=3310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:41.428000 audit[3310]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=2700 a0=3 a1=ffffcb0a64b0 a2=0 a3=1 items=0 ppid=2976 pid=3310 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:41.432967 kernel: audit: type=1325 audit(1768586321.428:525): table=nat:116 family=2 entries=12 op=nft_register_rule pid=3310 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:41.428000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:41.620585 kubelet[2825]: E0116 17:58:41.620351 2825 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.620585 kubelet[2825]: E0116 17:58:41.620488 2825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-tigera-ca-bundle podName:4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d nodeName:}" failed. No retries permitted until 2026-01-16 17:58:42.120457382 +0000 UTC m=+33.508373237 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-tigera-ca-bundle") pod "calico-typha-774699bc97-p7jq9" (UID: "4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d") : failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.621312 kubelet[2825]: E0116 17:58:41.620621 2825 secret.go:189] Couldn't get secret calico-system/typha-certs: failed to sync secret cache: timed out waiting for the condition Jan 16 17:58:41.621312 kubelet[2825]: E0116 17:58:41.620682 2825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-typha-certs podName:4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d nodeName:}" failed. No retries permitted until 2026-01-16 17:58:42.120665956 +0000 UTC m=+33.508581771 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "typha-certs" (UniqueName: "kubernetes.io/secret/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-typha-certs") pod "calico-typha-774699bc97-p7jq9" (UID: "4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d") : failed to sync secret cache: timed out waiting for the condition Jan 16 17:58:41.633824 kubelet[2825]: E0116 17:58:41.633781 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.633824 kubelet[2825]: W0116 17:58:41.633809 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.633824 kubelet[2825]: E0116 17:58:41.633833 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.634015 kubelet[2825]: E0116 17:58:41.633998 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.634015 kubelet[2825]: W0116 17:58:41.634011 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.634131 kubelet[2825]: E0116 17:58:41.634021 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.634175 kubelet[2825]: E0116 17:58:41.634129 2825 projected.go:291] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.634175 kubelet[2825]: E0116 17:58:41.634147 2825 projected.go:196] Error preparing data for projected volume kube-api-access-7fpbm for pod calico-system/calico-typha-774699bc97-p7jq9: failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.634243 kubelet[2825]: E0116 17:58:41.634206 2825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-kube-api-access-7fpbm podName:4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d nodeName:}" failed. No retries permitted until 2026-01-16 17:58:42.134189029 +0000 UTC m=+33.522104804 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-7fpbm" (UniqueName: "kubernetes.io/projected/4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d-kube-api-access-7fpbm") pod "calico-typha-774699bc97-p7jq9" (UID: "4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d") : failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.735925 kubelet[2825]: E0116 17:58:41.735867 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.735925 kubelet[2825]: W0116 17:58:41.735899 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.735925 kubelet[2825]: E0116 17:58:41.735919 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.736157 kubelet[2825]: E0116 17:58:41.736131 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.736157 kubelet[2825]: W0116 17:58:41.736148 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.736211 kubelet[2825]: E0116 17:58:41.736158 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.736325 kubelet[2825]: E0116 17:58:41.736299 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.736325 kubelet[2825]: W0116 17:58:41.736314 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.736325 kubelet[2825]: E0116 17:58:41.736323 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.822899 kubelet[2825]: E0116 17:58:41.822791 2825 configmap.go:193] Couldn't get configMap calico-system/tigera-ca-bundle: failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.823138 kubelet[2825]: E0116 17:58:41.822916 2825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/19c043e8-4f90-4e85-af85-45e376d669ab-tigera-ca-bundle podName:19c043e8-4f90-4e85-af85-45e376d669ab nodeName:}" failed. No retries permitted until 2026-01-16 17:58:42.32288812 +0000 UTC m=+33.710803975 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tigera-ca-bundle" (UniqueName: "kubernetes.io/configmap/19c043e8-4f90-4e85-af85-45e376d669ab-tigera-ca-bundle") pod "calico-node-wgssq" (UID: "19c043e8-4f90-4e85-af85-45e376d669ab") : failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.837484 kubelet[2825]: E0116 17:58:41.837444 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.837484 kubelet[2825]: W0116 17:58:41.837476 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.837744 kubelet[2825]: E0116 17:58:41.837503 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.837932 kubelet[2825]: E0116 17:58:41.837891 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.837932 kubelet[2825]: W0116 17:58:41.837914 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.837932 kubelet[2825]: E0116 17:58:41.837931 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.838214 kubelet[2825]: E0116 17:58:41.838191 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.838214 kubelet[2825]: W0116 17:58:41.838208 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.838294 kubelet[2825]: E0116 17:58:41.838222 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.838429 kubelet[2825]: E0116 17:58:41.838415 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.838467 kubelet[2825]: W0116 17:58:41.838429 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.838467 kubelet[2825]: E0116 17:58:41.838442 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.843735 kubelet[2825]: E0116 17:58:41.843569 2825 projected.go:291] Couldn't get configMap calico-system/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.843735 kubelet[2825]: E0116 17:58:41.843614 2825 projected.go:196] Error preparing data for projected volume kube-api-access-txmfg for pod calico-system/calico-node-wgssq: failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.843735 kubelet[2825]: E0116 17:58:41.843689 2825 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/19c043e8-4f90-4e85-af85-45e376d669ab-kube-api-access-txmfg podName:19c043e8-4f90-4e85-af85-45e376d669ab nodeName:}" failed. No retries permitted until 2026-01-16 17:58:42.343668362 +0000 UTC m=+33.731584217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-txmfg" (UniqueName: "kubernetes.io/projected/19c043e8-4f90-4e85-af85-45e376d669ab-kube-api-access-txmfg") pod "calico-node-wgssq" (UID: "19c043e8-4f90-4e85-af85-45e376d669ab") : failed to sync configmap cache: timed out waiting for the condition Jan 16 17:58:41.940081 kubelet[2825]: E0116 17:58:41.939797 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.940081 kubelet[2825]: W0116 17:58:41.939839 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.940081 kubelet[2825]: E0116 17:58:41.939866 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.940719 kubelet[2825]: E0116 17:58:41.940469 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.940719 kubelet[2825]: W0116 17:58:41.940494 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.940719 kubelet[2825]: E0116 17:58:41.940515 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.941008 kubelet[2825]: E0116 17:58:41.940986 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.941235 kubelet[2825]: W0116 17:58:41.941171 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.941359 kubelet[2825]: E0116 17:58:41.941337 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:41.941866 kubelet[2825]: E0116 17:58:41.941847 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.942074 kubelet[2825]: W0116 17:58:41.941934 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.942074 kubelet[2825]: E0116 17:58:41.941957 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:41.942256 kubelet[2825]: E0116 17:58:41.942242 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:41.942325 kubelet[2825]: W0116 17:58:41.942313 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:41.942383 kubelet[2825]: E0116 17:58:41.942371 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.017889 kubelet[2825]: E0116 17:58:42.017595 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.017889 kubelet[2825]: W0116 17:58:42.017630 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.017889 kubelet[2825]: E0116 17:58:42.017652 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.043435 kubelet[2825]: E0116 17:58:42.043389 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.043435 kubelet[2825]: W0116 17:58:42.043431 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.043658 kubelet[2825]: E0116 17:58:42.043463 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.043835 kubelet[2825]: E0116 17:58:42.043812 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.043892 kubelet[2825]: W0116 17:58:42.043838 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.043892 kubelet[2825]: E0116 17:58:42.043861 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.044177 kubelet[2825]: E0116 17:58:42.044149 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.044177 kubelet[2825]: W0116 17:58:42.044174 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.044288 kubelet[2825]: E0116 17:58:42.044192 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.044482 kubelet[2825]: E0116 17:58:42.044463 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.044523 kubelet[2825]: W0116 17:58:42.044484 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.044523 kubelet[2825]: E0116 17:58:42.044500 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.044916 kubelet[2825]: E0116 17:58:42.044891 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.044957 kubelet[2825]: W0116 17:58:42.044922 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.044957 kubelet[2825]: E0116 17:58:42.044943 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.146420 kubelet[2825]: E0116 17:58:42.146343 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.146420 kubelet[2825]: W0116 17:58:42.146403 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.146420 kubelet[2825]: E0116 17:58:42.146430 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.146780 kubelet[2825]: E0116 17:58:42.146739 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.146780 kubelet[2825]: W0116 17:58:42.146759 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.146780 kubelet[2825]: E0116 17:58:42.146773 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.147108 kubelet[2825]: E0116 17:58:42.147064 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.147108 kubelet[2825]: W0116 17:58:42.147085 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.147108 kubelet[2825]: E0116 17:58:42.147108 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.147380 kubelet[2825]: E0116 17:58:42.147363 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.147380 kubelet[2825]: W0116 17:58:42.147379 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.147449 kubelet[2825]: E0116 17:58:42.147394 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.147651 kubelet[2825]: E0116 17:58:42.147629 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.147651 kubelet[2825]: W0116 17:58:42.147644 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.147739 kubelet[2825]: E0116 17:58:42.147654 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.148027 kubelet[2825]: E0116 17:58:42.148006 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.148027 kubelet[2825]: W0116 17:58:42.148025 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.148152 kubelet[2825]: E0116 17:58:42.148038 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.148267 kubelet[2825]: E0116 17:58:42.148247 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.148307 kubelet[2825]: W0116 17:58:42.148270 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.148307 kubelet[2825]: E0116 17:58:42.148281 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.148473 kubelet[2825]: E0116 17:58:42.148454 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.148473 kubelet[2825]: W0116 17:58:42.148472 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.148540 kubelet[2825]: E0116 17:58:42.148483 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.148713 kubelet[2825]: E0116 17:58:42.148699 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.148713 kubelet[2825]: W0116 17:58:42.148712 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.148795 kubelet[2825]: E0116 17:58:42.148722 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.148953 kubelet[2825]: E0116 17:58:42.148938 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.148953 kubelet[2825]: W0116 17:58:42.148952 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.149001 kubelet[2825]: E0116 17:58:42.148962 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.149340 kubelet[2825]: E0116 17:58:42.149322 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.149380 kubelet[2825]: W0116 17:58:42.149341 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.149380 kubelet[2825]: E0116 17:58:42.149353 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.149703 kubelet[2825]: E0116 17:58:42.149685 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.149703 kubelet[2825]: W0116 17:58:42.149701 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.149776 kubelet[2825]: E0116 17:58:42.149711 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.150267 kubelet[2825]: E0116 17:58:42.150024 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.150267 kubelet[2825]: W0116 17:58:42.150037 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.150267 kubelet[2825]: E0116 17:58:42.150047 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.150267 kubelet[2825]: E0116 17:58:42.150234 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.150267 kubelet[2825]: W0116 17:58:42.150242 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.150267 kubelet[2825]: E0116 17:58:42.150250 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.151022 kubelet[2825]: E0116 17:58:42.150991 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.151022 kubelet[2825]: W0116 17:58:42.151013 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.151022 kubelet[2825]: E0116 17:58:42.151024 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.151583 kubelet[2825]: E0116 17:58:42.151199 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.151583 kubelet[2825]: W0116 17:58:42.151212 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.151583 kubelet[2825]: E0116 17:58:42.151223 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.151583 kubelet[2825]: E0116 17:58:42.151325 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.151583 kubelet[2825]: W0116 17:58:42.151331 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.151583 kubelet[2825]: E0116 17:58:42.151338 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.151583 kubelet[2825]: E0116 17:58:42.151460 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.151583 kubelet[2825]: W0116 17:58:42.151467 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.151583 kubelet[2825]: E0116 17:58:42.151474 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.158950 kubelet[2825]: E0116 17:58:42.158658 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.158950 kubelet[2825]: W0116 17:58:42.158680 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.158950 kubelet[2825]: E0116 17:58:42.158712 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.165078 kubelet[2825]: E0116 17:58:42.165037 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.165078 kubelet[2825]: W0116 17:58:42.165063 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.165241 kubelet[2825]: E0116 17:58:42.165085 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.247384 containerd[1547]: time="2026-01-16T17:58:42.247326030Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-774699bc97-p7jq9,Uid:4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d,Namespace:calico-system,Attempt:0,}" Jan 16 17:58:42.249147 kubelet[2825]: E0116 17:58:42.249071 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.249147 kubelet[2825]: W0116 17:58:42.249093 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.249147 kubelet[2825]: E0116 17:58:42.249121 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.249681 kubelet[2825]: E0116 17:58:42.249561 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.249681 kubelet[2825]: W0116 17:58:42.249577 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.249681 kubelet[2825]: E0116 17:58:42.249588 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.279689 containerd[1547]: time="2026-01-16T17:58:42.279228290Z" level=info msg="connecting to shim 258bfbb3c6dfe6328f16b87e172bdc8c57b95993e9b007f8d45a1dbae1d732e4" address="unix:///run/containerd/s/2c3236c10750397add9d94fadb04f83be1be91c980b46836043e298f5290c44b" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:42.307852 systemd[1]: Started cri-containerd-258bfbb3c6dfe6328f16b87e172bdc8c57b95993e9b007f8d45a1dbae1d732e4.scope - libcontainer container 258bfbb3c6dfe6328f16b87e172bdc8c57b95993e9b007f8d45a1dbae1d732e4. Jan 16 17:58:42.324000 audit: BPF prog-id=149 op=LOAD Jan 16 17:58:42.325000 audit: BPF prog-id=150 op=LOAD Jan 16 17:58:42.325000 audit[3376]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=3364 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.325000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.325000 audit: BPF prog-id=150 op=UNLOAD Jan 16 17:58:42.325000 audit[3376]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3364 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.325000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.325000 audit: BPF prog-id=151 op=LOAD Jan 16 17:58:42.325000 audit[3376]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=3364 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.325000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.325000 audit: BPF prog-id=152 op=LOAD Jan 16 17:58:42.325000 audit[3376]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=3364 
pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.325000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.326000 audit: BPF prog-id=152 op=UNLOAD Jan 16 17:58:42.326000 audit[3376]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3364 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.326000 audit: BPF prog-id=151 op=UNLOAD Jan 16 17:58:42.326000 audit[3376]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3364 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.326000 audit: BPF prog-id=153 op=LOAD Jan 16 17:58:42.326000 audit[3376]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=3364 pid=3376 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3235386266626233633664666536333238663136623837653137326264 Jan 16 17:58:42.352273 kubelet[2825]: E0116 17:58:42.351865 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.352273 kubelet[2825]: W0116 17:58:42.351894 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.352273 kubelet[2825]: E0116 17:58:42.351916 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.352273 kubelet[2825]: E0116 17:58:42.352155 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.352273 kubelet[2825]: W0116 17:58:42.352178 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.352273 kubelet[2825]: E0116 17:58:42.352190 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.352804 kubelet[2825]: E0116 17:58:42.352745 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.352868 kubelet[2825]: W0116 17:58:42.352807 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.352868 kubelet[2825]: E0116 17:58:42.352836 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.353313 kubelet[2825]: E0116 17:58:42.353285 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.353313 kubelet[2825]: W0116 17:58:42.353313 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.353462 kubelet[2825]: E0116 17:58:42.353336 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.353711 kubelet[2825]: E0116 17:58:42.353688 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.353772 kubelet[2825]: W0116 17:58:42.353712 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.353772 kubelet[2825]: E0116 17:58:42.353733 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.355367 kubelet[2825]: E0116 17:58:42.355105 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.355655 kubelet[2825]: W0116 17:58:42.355585 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.355655 kubelet[2825]: E0116 17:58:42.355615 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.356392 kubelet[2825]: E0116 17:58:42.356277 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.356392 kubelet[2825]: W0116 17:58:42.356293 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.356392 kubelet[2825]: E0116 17:58:42.356313 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.357107 kubelet[2825]: E0116 17:58:42.356958 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.357107 kubelet[2825]: W0116 17:58:42.356975 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.357107 kubelet[2825]: E0116 17:58:42.356989 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.358869 kubelet[2825]: E0116 17:58:42.358690 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.358869 kubelet[2825]: W0116 17:58:42.358740 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.359052 kubelet[2825]: E0116 17:58:42.358945 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.360697 kubelet[2825]: E0116 17:58:42.360665 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.360697 kubelet[2825]: W0116 17:58:42.360691 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.360779 kubelet[2825]: E0116 17:58:42.360712 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:42.362433 containerd[1547]: time="2026-01-16T17:58:42.362378402Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-typha-774699bc97-p7jq9,Uid:4eb5d9a2-5ef3-410c-b11d-03c24ab02e1d,Namespace:calico-system,Attempt:0,} returns sandbox id \"258bfbb3c6dfe6328f16b87e172bdc8c57b95993e9b007f8d45a1dbae1d732e4\"" Jan 16 17:58:42.362855 kubelet[2825]: E0116 17:58:42.362778 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.362855 kubelet[2825]: W0116 17:58:42.362805 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.362855 kubelet[2825]: E0116 17:58:42.362823 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.363930 kubelet[2825]: E0116 17:58:42.363912 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:42.364009 kubelet[2825]: W0116 17:58:42.363996 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:42.364061 kubelet[2825]: E0116 17:58:42.364051 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:42.364965 containerd[1547]: time="2026-01-16T17:58:42.364930090Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\"" Jan 16 17:58:42.474370 containerd[1547]: time="2026-01-16T17:58:42.474289208Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wgssq,Uid:19c043e8-4f90-4e85-af85-45e376d669ab,Namespace:calico-system,Attempt:0,}" Jan 16 17:58:42.499515 containerd[1547]: time="2026-01-16T17:58:42.499459944Z" level=info msg="connecting to shim ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070" address="unix:///run/containerd/s/384aaf283451a91f809d6de45413e08368e6f6e32978d07e8953510a97ee1e85" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:58:42.528977 systemd[1]: Started cri-containerd-ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070.scope - libcontainer container ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070. 
Jan 16 17:58:42.546000 audit: BPF prog-id=154 op=LOAD Jan 16 17:58:42.547000 audit: BPF prog-id=155 op=LOAD Jan 16 17:58:42.547000 audit[3434]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.547000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.547000 audit: BPF prog-id=155 op=UNLOAD Jan 16 17:58:42.547000 audit[3434]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.547000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.548000 audit: BPF prog-id=156 op=LOAD Jan 16 17:58:42.548000 audit[3434]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.548000 audit: BPF prog-id=157 op=LOAD Jan 16 17:58:42.548000 audit[3434]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.548000 audit: BPF prog-id=157 op=UNLOAD Jan 16 17:58:42.548000 audit[3434]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.548000 audit: BPF prog-id=156 op=UNLOAD Jan 16 17:58:42.548000 audit[3434]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.548000 audit: BPF prog-id=158 op=LOAD Jan 16 17:58:42.548000 audit[3434]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3423 pid=3434 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:42.548000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6363646565366462663865646131353964313637666432613635353661 Jan 16 17:58:42.566423 containerd[1547]: time="2026-01-16T17:58:42.566365948Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-node-wgssq,Uid:19c043e8-4f90-4e85-af85-45e376d669ab,Namespace:calico-system,Attempt:0,} returns sandbox id \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\"" Jan 16 17:58:42.749229 kubelet[2825]: E0116 17:58:42.749195 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:44.017592 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3986106878.mount: Deactivated successfully. 
Jan 16 17:58:44.749691 kubelet[2825]: E0116 17:58:44.749059 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:44.795574 containerd[1547]: time="2026-01-16T17:58:44.795082616Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:44.797180 containerd[1547]: time="2026-01-16T17:58:44.797129985Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/typha:v3.30.4: active requests=0, bytes read=31716861" Jan 16 17:58:44.798103 containerd[1547]: time="2026-01-16T17:58:44.798077204Z" level=info msg="ImageCreate event name:\"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:44.801581 containerd[1547]: time="2026-01-16T17:58:44.801399533Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:44.802730 containerd[1547]: time="2026-01-16T17:58:44.802685894Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/typha:v3.30.4\" with image id \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\", repo tag \"ghcr.io/flatcar/calico/typha:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/typha@sha256:6f437220b5b3c627fb4a0fc8dc323363101f3c22a8f337612c2a1ddfb73b810c\", size \"33090541\" in 2.437715161s" Jan 16 17:58:44.802854 containerd[1547]: time="2026-01-16T17:58:44.802829943Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/typha:v3.30.4\" returns image reference \"sha256:5fe38d12a54098df5aaf5ec7228dc2f976f60cb4f434d7256f03126b004fdc5b\"" Jan 16 17:58:44.804403 containerd[1547]: time="2026-01-16T17:58:44.804352959Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\"" Jan 16 17:58:44.820024 containerd[1547]: time="2026-01-16T17:58:44.819982940Z" level=info msg="CreateContainer within sandbox \"258bfbb3c6dfe6328f16b87e172bdc8c57b95993e9b007f8d45a1dbae1d732e4\" for container &ContainerMetadata{Name:calico-typha,Attempt:0,}" Jan 16 17:58:44.829768 containerd[1547]: time="2026-01-16T17:58:44.829722032Z" level=info msg="Container fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:44.843073 containerd[1547]: time="2026-01-16T17:58:44.842999426Z" level=info msg="CreateContainer within sandbox \"258bfbb3c6dfe6328f16b87e172bdc8c57b95993e9b007f8d45a1dbae1d732e4\" for &ContainerMetadata{Name:calico-typha,Attempt:0,} returns container id \"fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2\"" Jan 16 17:58:44.843895 containerd[1547]: time="2026-01-16T17:58:44.843863640Z" level=info msg="StartContainer for \"fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2\"" Jan 16 17:58:44.845594 containerd[1547]: time="2026-01-16T17:58:44.845503063Z" level=info msg="connecting to shim fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2" address="unix:///run/containerd/s/2c3236c10750397add9d94fadb04f83be1be91c980b46836043e298f5290c44b" protocol=ttrpc version=3 Jan 16 17:58:44.869965 systemd[1]: Started 
cri-containerd-fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2.scope - libcontainer container fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2. Jan 16 17:58:44.885000 audit: BPF prog-id=159 op=LOAD Jan 16 17:58:44.886000 audit: BPF prog-id=160 op=LOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.886000 audit: BPF prog-id=160 op=UNLOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.886000 audit: BPF prog-id=161 op=LOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.886000 audit: BPF prog-id=162 op=LOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.886000 audit: BPF prog-id=162 op=UNLOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.886000 audit: BPF prog-id=161 op=UNLOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.886000 audit: BPF prog-id=163 op=LOAD Jan 16 17:58:44.886000 audit[3472]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3364 pid=3472 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:44.886000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6663333730626339383865363565353463646365346537323634386362 Jan 16 17:58:44.919076 containerd[1547]: time="2026-01-16T17:58:44.919034682Z" level=info msg="StartContainer for \"fc370bc988e65e54cdce4e72648cb5c66341a14d713d0a24f5406ebab6e7aff2\" returns successfully" Jan 16 17:58:45.942015 kubelet[2825]: I0116 17:58:45.941933 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-typha-774699bc97-p7jq9" podStartSLOduration=3.502326606 podStartE2EDuration="5.941909166s" podCreationTimestamp="2026-01-16 17:58:40 +0000 UTC" firstStartedPulling="2026-01-16 17:58:42.364351932 +0000 UTC m=+33.752267747" lastFinishedPulling="2026-01-16 17:58:44.803934452 +0000 UTC m=+36.191850307" observedRunningTime="2026-01-16 17:58:45.927850222 +0000 UTC m=+37.315766077" watchObservedRunningTime="2026-01-16 17:58:45.941909166 +0000 UTC m=+37.329824981" Jan 16 17:58:45.953077 kubelet[2825]: E0116 17:58:45.953006 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.953077 kubelet[2825]: W0116 17:58:45.953034 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.953077 kubelet[2825]: E0116 17:58:45.953055 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.954000 kubelet[2825]: E0116 17:58:45.953947 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.954161 kubelet[2825]: W0116 17:58:45.953965 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.954161 kubelet[2825]: E0116 17:58:45.954114 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.954899 kubelet[2825]: E0116 17:58:45.954882 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.954899 kubelet[2825]: W0116 17:58:45.954950 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.954899 kubelet[2825]: E0116 17:58:45.954966 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.955816 kubelet[2825]: E0116 17:58:45.955748 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.955816 kubelet[2825]: W0116 17:58:45.955765 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.955816 kubelet[2825]: E0116 17:58:45.955778 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.956181 kubelet[2825]: E0116 17:58:45.956162 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.956373 kubelet[2825]: W0116 17:58:45.956240 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.956373 kubelet[2825]: E0116 17:58:45.956256 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.956661 kubelet[2825]: E0116 17:58:45.956642 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.956900 kubelet[2825]: W0116 17:58:45.956827 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.956900 kubelet[2825]: E0116 17:58:45.956845 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.957565 kubelet[2825]: E0116 17:58:45.957379 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.957565 kubelet[2825]: W0116 17:58:45.957507 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.957565 kubelet[2825]: E0116 17:58:45.957520 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.958222 kubelet[2825]: E0116 17:58:45.958099 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.958222 kubelet[2825]: W0116 17:58:45.958115 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.958222 kubelet[2825]: E0116 17:58:45.958126 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.958847 kubelet[2825]: E0116 17:58:45.958776 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.958847 kubelet[2825]: W0116 17:58:45.958791 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.958847 kubelet[2825]: E0116 17:58:45.958802 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.959493 kubelet[2825]: E0116 17:58:45.959379 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.959493 kubelet[2825]: W0116 17:58:45.959393 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.959493 kubelet[2825]: E0116 17:58:45.959404 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.960028 kubelet[2825]: E0116 17:58:45.959870 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.960028 kubelet[2825]: W0116 17:58:45.959882 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.960028 kubelet[2825]: E0116 17:58:45.959895 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.960574 kubelet[2825]: E0116 17:58:45.960388 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.960574 kubelet[2825]: W0116 17:58:45.960419 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.960574 kubelet[2825]: E0116 17:58:45.960433 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.960889 kubelet[2825]: E0116 17:58:45.960866 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.961098 kubelet[2825]: W0116 17:58:45.960974 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.961098 kubelet[2825]: E0116 17:58:45.960990 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.961433 kubelet[2825]: E0116 17:58:45.961369 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.961433 kubelet[2825]: W0116 17:58:45.961381 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.961433 kubelet[2825]: E0116 17:58:45.961391 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.961941 kubelet[2825]: E0116 17:58:45.961775 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.961941 kubelet[2825]: W0116 17:58:45.961787 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.961941 kubelet[2825]: E0116 17:58:45.961797 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.967000 audit[3532]: NETFILTER_CFG table=filter:117 family=2 entries=21 op=nft_register_rule pid=3532 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:45.969679 kernel: kauditd_printk_skb: 68 callbacks suppressed Jan 16 17:58:45.969765 kernel: audit: type=1325 audit(1768586325.967:550): table=filter:117 family=2 entries=21 op=nft_register_rule pid=3532 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:45.967000 audit[3532]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc6189640 a2=0 a3=1 items=0 ppid=2976 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:45.967000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:45.974423 kernel: audit: type=1300 audit(1768586325.967:550): arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffc6189640 a2=0 a3=1 items=0 ppid=2976 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:45.974582 kernel: audit: type=1327 audit(1768586325.967:550): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:45.972000 audit[3532]: NETFILTER_CFG table=nat:118 family=2 entries=19 op=nft_register_chain pid=3532 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:45.976403 kernel: audit: type=1325 audit(1768586325.972:551): table=nat:118 family=2 entries=19 op=nft_register_chain pid=3532 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:58:45.982188 kernel: audit: type=1300 audit(1768586325.972:551): arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc6189640 a2=0 a3=1 items=0 ppid=2976 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:45.972000 audit[3532]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6276 a0=3 a1=ffffc6189640 a2=0 a3=1 items=0 ppid=2976 pid=3532 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:45.972000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:45.984621 kernel: audit: type=1327 audit(1768586325.972:551): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:58:45.985345 kubelet[2825]: E0116 17:58:45.985182 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.985345 kubelet[2825]: W0116 17:58:45.985207 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.985345 kubelet[2825]: E0116 17:58:45.985242 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory 
nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.986175 kubelet[2825]: E0116 17:58:45.986056 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.986175 kubelet[2825]: W0116 17:58:45.986073 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.986175 kubelet[2825]: E0116 17:58:45.986111 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.986654 kubelet[2825]: E0116 17:58:45.986638 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.986743 kubelet[2825]: W0116 17:58:45.986729 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.986878 kubelet[2825]: E0116 17:58:45.986809 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.987482 kubelet[2825]: E0116 17:58:45.987450 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.987740 kubelet[2825]: W0116 17:58:45.987521 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.987740 kubelet[2825]: E0116 17:58:45.987539 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.987878 kubelet[2825]: E0116 17:58:45.987858 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.987878 kubelet[2825]: W0116 17:58:45.987873 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.987944 kubelet[2825]: E0116 17:58:45.987885 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.988153 kubelet[2825]: E0116 17:58:45.988137 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.988153 kubelet[2825]: W0116 17:58:45.988152 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.988333 kubelet[2825]: E0116 17:58:45.988163 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.988630 kubelet[2825]: E0116 17:58:45.988608 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.988674 kubelet[2825]: W0116 17:58:45.988640 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.988674 kubelet[2825]: E0116 17:58:45.988654 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.988853 kubelet[2825]: E0116 17:58:45.988816 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.988853 kubelet[2825]: W0116 17:58:45.988825 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.988853 kubelet[2825]: E0116 17:58:45.988833 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.989112 kubelet[2825]: E0116 17:58:45.989100 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.989194 kubelet[2825]: W0116 17:58:45.989114 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.989194 kubelet[2825]: E0116 17:58:45.989125 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.989378 kubelet[2825]: E0116 17:58:45.989363 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.989378 kubelet[2825]: W0116 17:58:45.989377 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.989439 kubelet[2825]: E0116 17:58:45.989387 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.989645 kubelet[2825]: E0116 17:58:45.989631 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.989645 kubelet[2825]: W0116 17:58:45.989645 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.989714 kubelet[2825]: E0116 17:58:45.989655 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.989867 kubelet[2825]: E0116 17:58:45.989849 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.989867 kubelet[2825]: W0116 17:58:45.989866 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.989944 kubelet[2825]: E0116 17:58:45.989878 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.990305 kubelet[2825]: E0116 17:58:45.990285 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.990395 kubelet[2825]: W0116 17:58:45.990381 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.990468 kubelet[2825]: E0116 17:58:45.990457 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.990822 kubelet[2825]: E0116 17:58:45.990792 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.990984 kubelet[2825]: W0116 17:58:45.990911 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.990984 kubelet[2825]: E0116 17:58:45.990930 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.991337 kubelet[2825]: E0116 17:58:45.991180 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.991337 kubelet[2825]: W0116 17:58:45.991193 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.991337 kubelet[2825]: E0116 17:58:45.991208 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.991516 kubelet[2825]: E0116 17:58:45.991502 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.991612 kubelet[2825]: W0116 17:58:45.991599 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.991663 kubelet[2825]: E0116 17:58:45.991653 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. 
Error: unexpected end of JSON input" Jan 16 17:58:45.991904 kubelet[2825]: E0116 17:58:45.991891 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.992066 kubelet[2825]: W0116 17:58:45.991958 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.992066 kubelet[2825]: E0116 17:58:45.991973 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:45.992269 kubelet[2825]: E0116 17:58:45.992256 2825 driver-call.go:262] Failed to unmarshal output for command: init, output: "", error: unexpected end of JSON input Jan 16 17:58:45.992375 kubelet[2825]: W0116 17:58:45.992360 2825 driver-call.go:149] FlexVolume: driver call failed: executable: /opt/libexec/kubernetes/kubelet-plugins/volume/exec/nodeagent~uds/uds, args: [init], error: executable file not found in $PATH, output: "" Jan 16 17:58:45.992433 kubelet[2825]: E0116 17:58:45.992422 2825 plugins.go:697] "Error dynamically probing plugins" err="error creating Flexvolume plugin from directory nodeagent~uds, skipping. Error: unexpected end of JSON input" Jan 16 17:58:46.443849 containerd[1547]: time="2026-01-16T17:58:46.443760677Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:46.444932 containerd[1547]: time="2026-01-16T17:58:46.444707214Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4: active requests=0, bytes read=4262566" Jan 16 17:58:46.446756 containerd[1547]: time="2026-01-16T17:58:46.446702494Z" level=info msg="ImageCreate event name:\"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:46.450834 containerd[1547]: time="2026-01-16T17:58:46.450509003Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:46.452715 containerd[1547]: time="2026-01-16T17:58:46.452005973Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" with image id \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\", repo tag \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/pod2daemon-flexvol@sha256:50bdfe370b7308fa9957ed1eaccd094aa4f27f9a4f1dfcfef2f8a7696a1551e1\", size \"5636392\" in 1.647601291s" Jan 16 17:58:46.452715 containerd[1547]: time="2026-01-16T17:58:46.452057816Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/pod2daemon-flexvol:v3.30.4\" returns image reference \"sha256:90ff755393144dc5a3c05f95ffe1a3ecd2f89b98ecf36d9e4721471b80af4640\"" Jan 16 17:58:46.458622 containerd[1547]: time="2026-01-16T17:58:46.458425439Z" level=info msg="CreateContainer within sandbox \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\" for container &ContainerMetadata{Name:flexvol-driver,Attempt:0,}" Jan 16 17:58:46.470460 containerd[1547]: time="2026-01-16T17:58:46.468810624Z" level=info msg="Container dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee: 
CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:46.479777 containerd[1547]: time="2026-01-16T17:58:46.479739162Z" level=info msg="CreateContainer within sandbox \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\" for &ContainerMetadata{Name:flexvol-driver,Attempt:0,} returns container id \"dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee\"" Jan 16 17:58:46.480405 containerd[1547]: time="2026-01-16T17:58:46.480379080Z" level=info msg="StartContainer for \"dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee\"" Jan 16 17:58:46.483106 containerd[1547]: time="2026-01-16T17:58:46.483078523Z" level=info msg="connecting to shim dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee" address="unix:///run/containerd/s/384aaf283451a91f809d6de45413e08368e6f6e32978d07e8953510a97ee1e85" protocol=ttrpc version=3 Jan 16 17:58:46.507788 systemd[1]: Started cri-containerd-dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee.scope - libcontainer container dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee. Jan 16 17:58:46.573581 kernel: audit: type=1334 audit(1768586326.569:552): prog-id=164 op=LOAD Jan 16 17:58:46.573654 kernel: audit: type=1300 audit(1768586326.569:552): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3423 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:46.569000 audit: BPF prog-id=164 op=LOAD Jan 16 17:58:46.569000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=3423 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:46.569000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462623333396164633165386535356131373335616132313066636137 Jan 16 17:58:46.576085 kernel: audit: type=1327 audit(1768586326.569:552): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462623333396164633165386535356131373335616132313066636137 Jan 16 17:58:46.570000 audit: BPF prog-id=165 op=LOAD Jan 16 17:58:46.577065 kernel: audit: type=1334 audit(1768586326.570:553): prog-id=165 op=LOAD Jan 16 17:58:46.570000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=3423 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:46.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462623333396164633165386535356131373335616132313066636137 Jan 16 17:58:46.570000 audit: BPF prog-id=165 op=UNLOAD Jan 16 17:58:46.570000 audit[3555]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3555 
auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:46.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462623333396164633165386535356131373335616132313066636137 Jan 16 17:58:46.570000 audit: BPF prog-id=164 op=UNLOAD Jan 16 17:58:46.570000 audit[3555]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:46.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462623333396164633165386535356131373335616132313066636137 Jan 16 17:58:46.570000 audit: BPF prog-id=166 op=LOAD Jan 16 17:58:46.570000 audit[3555]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=3423 pid=3555 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:46.570000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6462623333396164633165386535356131373335616132313066636137 Jan 16 17:58:46.601303 containerd[1547]: time="2026-01-16T17:58:46.601194670Z" level=info msg="StartContainer for \"dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee\" returns successfully" Jan 16 17:58:46.618050 systemd[1]: cri-containerd-dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee.scope: Deactivated successfully. Jan 16 17:58:46.619000 audit: BPF prog-id=166 op=UNLOAD Jan 16 17:58:46.623121 containerd[1547]: time="2026-01-16T17:58:46.623080506Z" level=info msg="received container exit event container_id:\"dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee\" id:\"dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee\" pid:3568 exited_at:{seconds:1768586326 nanos:622617079}" Jan 16 17:58:46.645120 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-dbb339adc1e8e55a1735aa210fca76a2c0e2becaec6b5cf3ec65b5cc10b0a7ee-rootfs.mount: Deactivated successfully. 
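The PROCTITLE values in the audit records above are hex-encoded because the kernel logs the raw proctitle, in which argv elements are separated by NUL bytes; the runc and iptables-restore payloads decode back to ordinary command lines. A minimal, standalone Go sketch of that decoding (not part of any component logged here; the sample payload is copied from one of the iptables-restore records above):

package main

import (
	"encoding/hex"
	"fmt"
	"strings"
)

// decodeProctitle converts an audit PROCTITLE hex payload back into a
// readable command line. The value is hex-encoded because the original
// proctitle contains NUL bytes separating the argv elements.
func decodeProctitle(h string) (string, error) {
	raw, err := hex.DecodeString(h)
	if err != nil {
		return "", err
	}
	// Replace the NUL separators with spaces to recover "argv0 argv1 ...".
	return strings.ReplaceAll(string(raw), "\x00", " "), nil
}

func main() {
	// Payload copied from an iptables-restore audit record earlier in this log.
	const payload = "69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273"
	cmdline, err := decodeProctitle(payload)
	if err != nil {
		panic(err)
	}
	fmt.Println(cmdline) // iptables-restore -w 5 --noflush --counters
}

The longer runc payloads decode the same way, beginning with "runc --root /run/containerd/runc/k8s.io --log /run/containerd/io.containerd.runtime.v2.task/k8s.io/..." (the container-ID portion is truncated in the records above and is left as-is here).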
Jan 16 17:58:46.749124 kubelet[2825]: E0116 17:58:46.748990 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:46.921543 containerd[1547]: time="2026-01-16T17:58:46.921499702Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\"" Jan 16 17:58:48.747503 kubelet[2825]: E0116 17:58:48.746884 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:50.493085 containerd[1547]: time="2026-01-16T17:58:50.492993716Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:50.494807 containerd[1547]: time="2026-01-16T17:58:50.494726853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/cni:v3.30.4: active requests=0, bytes read=65921248" Jan 16 17:58:50.495761 containerd[1547]: time="2026-01-16T17:58:50.495717748Z" level=info msg="ImageCreate event name:\"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:50.499217 containerd[1547]: time="2026-01-16T17:58:50.499145939Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:50.499882 containerd[1547]: time="2026-01-16T17:58:50.499836938Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/cni:v3.30.4\" with image id \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\", repo tag \"ghcr.io/flatcar/calico/cni:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/cni@sha256:273501a9cfbd848ade2b6a8452dfafdd3adb4f9bf9aec45c398a5d19b8026627\", size \"67295507\" in 3.578296393s" Jan 16 17:58:50.499882 containerd[1547]: time="2026-01-16T17:58:50.499870780Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/cni:v3.30.4\" returns image reference \"sha256:e60d442b6496497355efdf45eaa3ea72f5a2b28a5187aeab33442933f3c735d2\"" Jan 16 17:58:50.506007 containerd[1547]: time="2026-01-16T17:58:50.505965800Z" level=info msg="CreateContainer within sandbox \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\" for container &ContainerMetadata{Name:install-cni,Attempt:0,}" Jan 16 17:58:50.521603 containerd[1547]: time="2026-01-16T17:58:50.520096308Z" level=info msg="Container 7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:50.532911 containerd[1547]: time="2026-01-16T17:58:50.532848620Z" level=info msg="CreateContainer within sandbox \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\" for &ContainerMetadata{Name:install-cni,Attempt:0,} returns container id \"7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2\"" Jan 16 17:58:50.534649 containerd[1547]: time="2026-01-16T17:58:50.533684347Z" level=info msg="StartContainer for \"7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2\"" Jan 16 17:58:50.536503 
containerd[1547]: time="2026-01-16T17:58:50.536464382Z" level=info msg="connecting to shim 7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2" address="unix:///run/containerd/s/384aaf283451a91f809d6de45413e08368e6f6e32978d07e8953510a97ee1e85" protocol=ttrpc version=3 Jan 16 17:58:50.566026 systemd[1]: Started cri-containerd-7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2.scope - libcontainer container 7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2. Jan 16 17:58:50.620000 audit: BPF prog-id=167 op=LOAD Jan 16 17:58:50.620000 audit[3613]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=3423 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:50.620000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761386139613837656638316663666263623861393339623663326265 Jan 16 17:58:50.621000 audit: BPF prog-id=168 op=LOAD Jan 16 17:58:50.621000 audit[3613]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=3423 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:50.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761386139613837656638316663666263623861393339623663326265 Jan 16 17:58:50.621000 audit: BPF prog-id=168 op=UNLOAD Jan 16 17:58:50.621000 audit[3613]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:50.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761386139613837656638316663666263623861393339623663326265 Jan 16 17:58:50.621000 audit: BPF prog-id=167 op=UNLOAD Jan 16 17:58:50.621000 audit[3613]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:50.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761386139613837656638316663666263623861393339623663326265 Jan 16 17:58:50.621000 audit: BPF prog-id=169 op=LOAD Jan 16 17:58:50.621000 audit[3613]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=3423 pid=3613 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" 
subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:50.621000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3761386139613837656638316663666263623861393339623663326265 Jan 16 17:58:50.645622 containerd[1547]: time="2026-01-16T17:58:50.645512388Z" level=info msg="StartContainer for \"7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2\" returns successfully" Jan 16 17:58:50.747438 kubelet[2825]: E0116 17:58:50.747277 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: cni plugin not initialized" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:51.141282 containerd[1547]: time="2026-01-16T17:58:51.140948347Z" level=error msg="failed to reload cni configuration after receiving fs change event(WRITE \"/etc/cni/net.d/calico-kubeconfig\")" error="cni config load failed: no network config found in /etc/cni/net.d: cni plugin not initialized: failed to load cni config" Jan 16 17:58:51.143921 systemd[1]: cri-containerd-7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2.scope: Deactivated successfully. Jan 16 17:58:51.146647 containerd[1547]: time="2026-01-16T17:58:51.146289360Z" level=info msg="received container exit event container_id:\"7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2\" id:\"7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2\" pid:3627 exited_at:{seconds:1768586331 nanos:146028666}" Jan 16 17:58:51.146780 systemd[1]: cri-containerd-7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2.scope: Consumed 503ms CPU time, 184.3M memory peak, 165.9M written to disk. Jan 16 17:58:51.148890 kernel: kauditd_printk_skb: 27 callbacks suppressed Jan 16 17:58:51.148972 kernel: audit: type=1334 audit(1768586331.147:563): prog-id=169 op=UNLOAD Jan 16 17:58:51.147000 audit: BPF prog-id=169 op=UNLOAD Jan 16 17:58:51.172297 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-7a8a9a87ef81fcfbcb8a939b6c2be03b0bf0263c8ff35728a3ffb61f62f706f2-rootfs.mount: Deactivated successfully. Jan 16 17:58:51.194641 kubelet[2825]: I0116 17:58:51.194275 2825 kubelet_node_status.go:439] "Fast updating node status as it just became ready" Jan 16 17:58:51.269951 systemd[1]: Created slice kubepods-burstable-pode6a77609_327b_44be_be8e_e3bd4ce03cac.slice - libcontainer container kubepods-burstable-pode6a77609_327b_44be_be8e_e3bd4ce03cac.slice. Jan 16 17:58:51.283711 systemd[1]: Created slice kubepods-burstable-pod42aeda7d_4fe5_4a6e_a477_419675cf7435.slice - libcontainer container kubepods-burstable-pod42aeda7d_4fe5_4a6e_a477_419675cf7435.slice. Jan 16 17:58:51.293582 systemd[1]: Created slice kubepods-besteffort-pode2b5c1b0_147f_4d63_9d4a_0d66468ad158.slice - libcontainer container kubepods-besteffort-pode2b5c1b0_147f_4d63_9d4a_0d66468ad158.slice. Jan 16 17:58:51.300903 systemd[1]: Created slice kubepods-besteffort-podc9307e86_6ef4_4f5e_8c8d_f9f21c28f28a.slice - libcontainer container kubepods-besteffort-podc9307e86_6ef4_4f5e_8c8d_f9f21c28f28a.slice. Jan 16 17:58:51.310712 systemd[1]: Created slice kubepods-besteffort-pod2eb1812e_29cb_4b14_8060_fe2f9a701433.slice - libcontainer container kubepods-besteffort-pod2eb1812e_29cb_4b14_8060_fe2f9a701433.slice. 
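The install-cni container has just run, yet containerd still reports "no network config found in /etc/cni/net.d: cni plugin not initialized", and the sandbox creation failures further down follow from the same missing Calico state. As a rough illustration of the condition that message refers to (an assumption about the shape of the check, not containerd's actual loader code; the extension list is likewise an assumption), a small self-contained Go sketch that reports whether /etc/cni/net.d holds any candidate CNI configuration files:

package main

import (
	"fmt"
	"os"
	"path/filepath"
)

// hasCNIConfig reports whether dir contains at least one file that could be
// a CNI network configuration. The real runtime additionally parses and
// validates whichever file it selects; this sketch only checks that
// candidates exist at all, which is the situation the "no network config
// found" message above describes when the directory is still empty.
func hasCNIConfig(dir string) (bool, error) {
	entries, err := os.ReadDir(dir)
	if err != nil {
		return false, err
	}
	for _, e := range entries {
		if e.IsDir() {
			continue
		}
		switch filepath.Ext(e.Name()) {
		case ".conf", ".conflist", ".json":
			return true, nil
		}
	}
	return false, nil
}

func main() {
	ok, err := hasCNIConfig("/etc/cni/net.d")
	if err != nil {
		fmt.Println("cannot read CNI config dir:", err)
		return
	}
	fmt.Println("CNI network config present:", ok)
}

Until a valid config appears in that directory and the Calico node state under /var/lib/calico/ is populated, the kubelet keeps NetworkReady=false and pod sandbox creation keeps failing, as the subsequent entries show.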
Jan 16 17:58:51.317510 systemd[1]: Created slice kubepods-besteffort-podd584925c_30a0_4760_b407_e5a8a40d9f3d.slice - libcontainer container kubepods-besteffort-podd584925c_30a0_4760_b407_e5a8a40d9f3d.slice. Jan 16 17:58:51.326629 systemd[1]: Created slice kubepods-besteffort-pod4cc0644f_445f_4df1_845a_17b848af3c7f.slice - libcontainer container kubepods-besteffort-pod4cc0644f_445f_4df1_845a_17b848af3c7f.slice. Jan 16 17:58:51.328578 kubelet[2825]: I0116 17:58:51.328341 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/2eb1812e-29cb-4b14-8060-fe2f9a701433-config\") pod \"goldmane-7c778bb748-pqdht\" (UID: \"2eb1812e-29cb-4b14-8060-fe2f9a701433\") " pod="calico-system/goldmane-7c778bb748-pqdht" Jan 16 17:58:51.328578 kubelet[2825]: I0116 17:58:51.328379 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/2eb1812e-29cb-4b14-8060-fe2f9a701433-goldmane-ca-bundle\") pod \"goldmane-7c778bb748-pqdht\" (UID: \"2eb1812e-29cb-4b14-8060-fe2f9a701433\") " pod="calico-system/goldmane-7c778bb748-pqdht" Jan 16 17:58:51.328578 kubelet[2825]: I0116 17:58:51.328399 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c6x28\" (UniqueName: \"kubernetes.io/projected/d584925c-30a0-4760-b407-e5a8a40d9f3d-kube-api-access-c6x28\") pod \"calico-apiserver-767f75bcf4-xk99t\" (UID: \"d584925c-30a0-4760-b407-e5a8a40d9f3d\") " pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" Jan 16 17:58:51.328578 kubelet[2825]: I0116 17:58:51.328415 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-krtjq\" (UniqueName: \"kubernetes.io/projected/e6a77609-327b-44be-be8e-e3bd4ce03cac-kube-api-access-krtjq\") pod \"coredns-66bc5c9577-gshmq\" (UID: \"e6a77609-327b-44be-be8e-e3bd4ce03cac\") " pod="kube-system/coredns-66bc5c9577-gshmq" Jan 16 17:58:51.328578 kubelet[2825]: I0116 17:58:51.328438 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7rht\" (UniqueName: \"kubernetes.io/projected/2eb1812e-29cb-4b14-8060-fe2f9a701433-kube-api-access-b7rht\") pod \"goldmane-7c778bb748-pqdht\" (UID: \"2eb1812e-29cb-4b14-8060-fe2f9a701433\") " pod="calico-system/goldmane-7c778bb748-pqdht" Jan 16 17:58:51.328813 kubelet[2825]: I0116 17:58:51.328452 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/e2b5c1b0-147f-4d63-9d4a-0d66468ad158-calico-apiserver-certs\") pod \"calico-apiserver-767f75bcf4-k622k\" (UID: \"e2b5c1b0-147f-4d63-9d4a-0d66468ad158\") " pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" Jan 16 17:58:51.328813 kubelet[2825]: I0116 17:58:51.328467 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5v59m\" (UniqueName: \"kubernetes.io/projected/e2b5c1b0-147f-4d63-9d4a-0d66468ad158-kube-api-access-5v59m\") pod \"calico-apiserver-767f75bcf4-k622k\" (UID: \"e2b5c1b0-147f-4d63-9d4a-0d66468ad158\") " pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" Jan 16 17:58:51.328813 kubelet[2825]: I0116 17:58:51.328485 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckk4l\" (UniqueName: 
\"kubernetes.io/projected/c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a-kube-api-access-ckk4l\") pod \"calico-kube-controllers-94bfc95c-f6gcl\" (UID: \"c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a\") " pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" Jan 16 17:58:51.328813 kubelet[2825]: I0116 17:58:51.328503 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h4jl6\" (UniqueName: \"kubernetes.io/projected/42aeda7d-4fe5-4a6e-a477-419675cf7435-kube-api-access-h4jl6\") pod \"coredns-66bc5c9577-xpvnz\" (UID: \"42aeda7d-4fe5-4a6e-a477-419675cf7435\") " pod="kube-system/coredns-66bc5c9577-xpvnz" Jan 16 17:58:51.328813 kubelet[2825]: I0116 17:58:51.328520 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-backend-key-pair\") pod \"whisker-74d4754fdf-vtkh7\" (UID: \"4cc0644f-445f-4df1-845a-17b848af3c7f\") " pod="calico-system/whisker-74d4754fdf-vtkh7" Jan 16 17:58:51.328916 kubelet[2825]: I0116 17:58:51.328534 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-ca-bundle\") pod \"whisker-74d4754fdf-vtkh7\" (UID: \"4cc0644f-445f-4df1-845a-17b848af3c7f\") " pod="calico-system/whisker-74d4754fdf-vtkh7" Jan 16 17:58:51.328975 kubelet[2825]: I0116 17:58:51.328960 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f6pb\" (UniqueName: \"kubernetes.io/projected/4cc0644f-445f-4df1-845a-17b848af3c7f-kube-api-access-4f6pb\") pod \"whisker-74d4754fdf-vtkh7\" (UID: \"4cc0644f-445f-4df1-845a-17b848af3c7f\") " pod="calico-system/whisker-74d4754fdf-vtkh7" Jan 16 17:58:51.329044 kubelet[2825]: I0116 17:58:51.329032 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"goldmane-key-pair\" (UniqueName: \"kubernetes.io/secret/2eb1812e-29cb-4b14-8060-fe2f9a701433-goldmane-key-pair\") pod \"goldmane-7c778bb748-pqdht\" (UID: \"2eb1812e-29cb-4b14-8060-fe2f9a701433\") " pod="calico-system/goldmane-7c778bb748-pqdht" Jan 16 17:58:51.329106 kubelet[2825]: I0116 17:58:51.329094 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"calico-apiserver-certs\" (UniqueName: \"kubernetes.io/secret/d584925c-30a0-4760-b407-e5a8a40d9f3d-calico-apiserver-certs\") pod \"calico-apiserver-767f75bcf4-xk99t\" (UID: \"d584925c-30a0-4760-b407-e5a8a40d9f3d\") " pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" Jan 16 17:58:51.329170 kubelet[2825]: I0116 17:58:51.329159 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e6a77609-327b-44be-be8e-e3bd4ce03cac-config-volume\") pod \"coredns-66bc5c9577-gshmq\" (UID: \"e6a77609-327b-44be-be8e-e3bd4ce03cac\") " pod="kube-system/coredns-66bc5c9577-gshmq" Jan 16 17:58:51.329233 kubelet[2825]: I0116 17:58:51.329221 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tigera-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a-tigera-ca-bundle\") pod \"calico-kube-controllers-94bfc95c-f6gcl\" (UID: \"c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a\") " 
pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" Jan 16 17:58:51.329302 kubelet[2825]: I0116 17:58:51.329290 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/42aeda7d-4fe5-4a6e-a477-419675cf7435-config-volume\") pod \"coredns-66bc5c9577-xpvnz\" (UID: \"42aeda7d-4fe5-4a6e-a477-419675cf7435\") " pod="kube-system/coredns-66bc5c9577-xpvnz" Jan 16 17:58:51.579814 containerd[1547]: time="2026-01-16T17:58:51.579526137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gshmq,Uid:e6a77609-327b-44be-be8e-e3bd4ce03cac,Namespace:kube-system,Attempt:0,}" Jan 16 17:58:51.590533 containerd[1547]: time="2026-01-16T17:58:51.590471498Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xpvnz,Uid:42aeda7d-4fe5-4a6e-a477-419675cf7435,Namespace:kube-system,Attempt:0,}" Jan 16 17:58:51.601232 containerd[1547]: time="2026-01-16T17:58:51.601197887Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-k622k,Uid:e2b5c1b0-147f-4d63-9d4a-0d66468ad158,Namespace:calico-apiserver,Attempt:0,}" Jan 16 17:58:51.609943 containerd[1547]: time="2026-01-16T17:58:51.609902244Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-94bfc95c-f6gcl,Uid:c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a,Namespace:calico-system,Attempt:0,}" Jan 16 17:58:51.618176 containerd[1547]: time="2026-01-16T17:58:51.617855361Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-pqdht,Uid:2eb1812e-29cb-4b14-8060-fe2f9a701433,Namespace:calico-system,Attempt:0,}" Jan 16 17:58:51.627743 containerd[1547]: time="2026-01-16T17:58:51.627700661Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-xk99t,Uid:d584925c-30a0-4760-b407-e5a8a40d9f3d,Namespace:calico-apiserver,Attempt:0,}" Jan 16 17:58:51.633016 containerd[1547]: time="2026-01-16T17:58:51.632811822Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74d4754fdf-vtkh7,Uid:4cc0644f-445f-4df1-845a-17b848af3c7f,Namespace:calico-system,Attempt:0,}" Jan 16 17:58:51.743988 containerd[1547]: time="2026-01-16T17:58:51.743942961Z" level=error msg="Failed to destroy network for sandbox \"edba8d367967101bf57f16575054f92f37bb99de94f14e0041a36a2ea3b53247\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.748146 containerd[1547]: time="2026-01-16T17:58:51.748073108Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-74d4754fdf-vtkh7,Uid:4cc0644f-445f-4df1-845a-17b848af3c7f,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"edba8d367967101bf57f16575054f92f37bb99de94f14e0041a36a2ea3b53247\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.748628 kubelet[2825]: E0116 17:58:51.748565 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edba8d367967101bf57f16575054f92f37bb99de94f14e0041a36a2ea3b53247\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running 
and has mounted /var/lib/calico/" Jan 16 17:58:51.749024 kubelet[2825]: E0116 17:58:51.748723 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edba8d367967101bf57f16575054f92f37bb99de94f14e0041a36a2ea3b53247\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-74d4754fdf-vtkh7" Jan 16 17:58:51.749024 kubelet[2825]: E0116 17:58:51.748748 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"edba8d367967101bf57f16575054f92f37bb99de94f14e0041a36a2ea3b53247\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/whisker-74d4754fdf-vtkh7" Jan 16 17:58:51.749641 kubelet[2825]: E0116 17:58:51.749197 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"whisker-74d4754fdf-vtkh7_calico-system(4cc0644f-445f-4df1-845a-17b848af3c7f)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"whisker-74d4754fdf-vtkh7_calico-system(4cc0644f-445f-4df1-845a-17b848af3c7f)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"edba8d367967101bf57f16575054f92f37bb99de94f14e0041a36a2ea3b53247\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/whisker-74d4754fdf-vtkh7" podUID="4cc0644f-445f-4df1-845a-17b848af3c7f" Jan 16 17:58:51.759752 containerd[1547]: time="2026-01-16T17:58:51.759679025Z" level=error msg="Failed to destroy network for sandbox \"d213e4879802a6639e63ab96dd94f6a3d86d5acb84b0d9e804beb9dc58c96058\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.763794 containerd[1547]: time="2026-01-16T17:58:51.763673164Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xpvnz,Uid:42aeda7d-4fe5-4a6e-a477-419675cf7435,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"d213e4879802a6639e63ab96dd94f6a3d86d5acb84b0d9e804beb9dc58c96058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.764010 kubelet[2825]: E0116 17:58:51.763934 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d213e4879802a6639e63ab96dd94f6a3d86d5acb84b0d9e804beb9dc58c96058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.764010 kubelet[2825]: E0116 17:58:51.763991 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d213e4879802a6639e63ab96dd94f6a3d86d5acb84b0d9e804beb9dc58c96058\": plugin type=\"calico\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-xpvnz" Jan 16 17:58:51.764181 kubelet[2825]: E0116 17:58:51.764010 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"d213e4879802a6639e63ab96dd94f6a3d86d5acb84b0d9e804beb9dc58c96058\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-xpvnz" Jan 16 17:58:51.764181 kubelet[2825]: E0116 17:58:51.764065 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-xpvnz_kube-system(42aeda7d-4fe5-4a6e-a477-419675cf7435)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-xpvnz_kube-system(42aeda7d-4fe5-4a6e-a477-419675cf7435)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"d213e4879802a6639e63ab96dd94f6a3d86d5acb84b0d9e804beb9dc58c96058\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-xpvnz" podUID="42aeda7d-4fe5-4a6e-a477-419675cf7435" Jan 16 17:58:51.803072 containerd[1547]: time="2026-01-16T17:58:51.803010323Z" level=error msg="Failed to destroy network for sandbox \"6a52033b5a00df4ccea289d3d553ccaa0bca820f6708b778136c28eb642acd1d\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.807508 containerd[1547]: time="2026-01-16T17:58:51.807432085Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gshmq,Uid:e6a77609-327b-44be-be8e-e3bd4ce03cac,Namespace:kube-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a52033b5a00df4ccea289d3d553ccaa0bca820f6708b778136c28eb642acd1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.808985 kubelet[2825]: E0116 17:58:51.807698 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a52033b5a00df4ccea289d3d553ccaa0bca820f6708b778136c28eb642acd1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.808985 kubelet[2825]: E0116 17:58:51.807750 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6a52033b5a00df4ccea289d3d553ccaa0bca820f6708b778136c28eb642acd1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-gshmq" Jan 16 17:58:51.808985 kubelet[2825]: E0116 17:58:51.807769 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup 
network for sandbox \"6a52033b5a00df4ccea289d3d553ccaa0bca820f6708b778136c28eb642acd1d\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="kube-system/coredns-66bc5c9577-gshmq" Jan 16 17:58:51.809113 kubelet[2825]: E0116 17:58:51.807815 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"coredns-66bc5c9577-gshmq_kube-system(e6a77609-327b-44be-be8e-e3bd4ce03cac)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"coredns-66bc5c9577-gshmq_kube-system(e6a77609-327b-44be-be8e-e3bd4ce03cac)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6a52033b5a00df4ccea289d3d553ccaa0bca820f6708b778136c28eb642acd1d\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="kube-system/coredns-66bc5c9577-gshmq" podUID="e6a77609-327b-44be-be8e-e3bd4ce03cac" Jan 16 17:58:51.812060 containerd[1547]: time="2026-01-16T17:58:51.811948293Z" level=error msg="Failed to destroy network for sandbox \"dcfd1ab018e2e07a4aaa3ff8bc475e0f2105e6e086182a6f6c659577451b04ba\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.816885 containerd[1547]: time="2026-01-16T17:58:51.816832041Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-k622k,Uid:e2b5c1b0-147f-4d63-9d4a-0d66468ad158,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"dcfd1ab018e2e07a4aaa3ff8bc475e0f2105e6e086182a6f6c659577451b04ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.817248 kubelet[2825]: E0116 17:58:51.817052 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dcfd1ab018e2e07a4aaa3ff8bc475e0f2105e6e086182a6f6c659577451b04ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.817248 kubelet[2825]: E0116 17:58:51.817100 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dcfd1ab018e2e07a4aaa3ff8bc475e0f2105e6e086182a6f6c659577451b04ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" Jan 16 17:58:51.817248 kubelet[2825]: E0116 17:58:51.817123 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"dcfd1ab018e2e07a4aaa3ff8bc475e0f2105e6e086182a6f6c659577451b04ba\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" Jan 16 
17:58:51.818634 kubelet[2825]: E0116 17:58:51.817166 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-767f75bcf4-k622k_calico-apiserver(e2b5c1b0-147f-4d63-9d4a-0d66468ad158)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-767f75bcf4-k622k_calico-apiserver(e2b5c1b0-147f-4d63-9d4a-0d66468ad158)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"dcfd1ab018e2e07a4aaa3ff8bc475e0f2105e6e086182a6f6c659577451b04ba\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:58:51.819080 containerd[1547]: time="2026-01-16T17:58:51.818885434Z" level=error msg="Failed to destroy network for sandbox \"cd594480d35885d674dd5abb99cca3592ea0e8f854ad8c801c96e450552aba48\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.821141 containerd[1547]: time="2026-01-16T17:58:51.821109316Z" level=error msg="Failed to destroy network for sandbox \"9427e0b449cead1378edb48ff6f19e8f5de724b4f06f617cff1c2a3f6cf4befb\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.821501 containerd[1547]: time="2026-01-16T17:58:51.821462655Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-94bfc95c-f6gcl,Uid:c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd594480d35885d674dd5abb99cca3592ea0e8f854ad8c801c96e450552aba48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.821759 kubelet[2825]: E0116 17:58:51.821722 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd594480d35885d674dd5abb99cca3592ea0e8f854ad8c801c96e450552aba48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.822049 kubelet[2825]: E0116 17:58:51.821776 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd594480d35885d674dd5abb99cca3592ea0e8f854ad8c801c96e450552aba48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" Jan 16 17:58:51.822049 kubelet[2825]: E0116 17:58:51.821798 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"cd594480d35885d674dd5abb99cca3592ea0e8f854ad8c801c96e450552aba48\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node 
container is running and has mounted /var/lib/calico/" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" Jan 16 17:58:51.822694 kubelet[2825]: E0116 17:58:51.822330 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-kube-controllers-94bfc95c-f6gcl_calico-system(c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-kube-controllers-94bfc95c-f6gcl_calico-system(c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"cd594480d35885d674dd5abb99cca3592ea0e8f854ad8c801c96e450552aba48\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:58:51.824343 containerd[1547]: time="2026-01-16T17:58:51.824137282Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-pqdht,Uid:2eb1812e-29cb-4b14-8060-fe2f9a701433,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"9427e0b449cead1378edb48ff6f19e8f5de724b4f06f617cff1c2a3f6cf4befb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.825203 kubelet[2825]: E0116 17:58:51.825175 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9427e0b449cead1378edb48ff6f19e8f5de724b4f06f617cff1c2a3f6cf4befb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.825302 kubelet[2825]: E0116 17:58:51.825286 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9427e0b449cead1378edb48ff6f19e8f5de724b4f06f617cff1c2a3f6cf4befb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-pqdht" Jan 16 17:58:51.825448 kubelet[2825]: E0116 17:58:51.825359 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"9427e0b449cead1378edb48ff6f19e8f5de724b4f06f617cff1c2a3f6cf4befb\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/goldmane-7c778bb748-pqdht" Jan 16 17:58:51.825634 kubelet[2825]: E0116 17:58:51.825415 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"goldmane-7c778bb748-pqdht_calico-system(2eb1812e-29cb-4b14-8060-fe2f9a701433)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"goldmane-7c778bb748-pqdht_calico-system(2eb1812e-29cb-4b14-8060-fe2f9a701433)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"9427e0b449cead1378edb48ff6f19e8f5de724b4f06f617cff1c2a3f6cf4befb\\\": plugin type=\\\"calico\\\" failed (add): stat 
/var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:58:51.833240 containerd[1547]: time="2026-01-16T17:58:51.833041571Z" level=error msg="Failed to destroy network for sandbox \"1121337d8caf903df1b34fdb01665aa070c45f8558df595826765010dea6dee8\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.835917 containerd[1547]: time="2026-01-16T17:58:51.835863566Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-xk99t,Uid:d584925c-30a0-4760-b407-e5a8a40d9f3d,Namespace:calico-apiserver,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"1121337d8caf903df1b34fdb01665aa070c45f8558df595826765010dea6dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.837885 kubelet[2825]: E0116 17:58:51.837575 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1121337d8caf903df1b34fdb01665aa070c45f8558df595826765010dea6dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:51.838073 kubelet[2825]: E0116 17:58:51.837968 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1121337d8caf903df1b34fdb01665aa070c45f8558df595826765010dea6dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" Jan 16 17:58:51.838194 kubelet[2825]: E0116 17:58:51.837992 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"1121337d8caf903df1b34fdb01665aa070c45f8558df595826765010dea6dee8\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" Jan 16 17:58:51.838279 kubelet[2825]: E0116 17:58:51.838177 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"calico-apiserver-767f75bcf4-xk99t_calico-apiserver(d584925c-30a0-4760-b407-e5a8a40d9f3d)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"calico-apiserver-767f75bcf4-xk99t_calico-apiserver(d584925c-30a0-4760-b407-e5a8a40d9f3d)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"1121337d8caf903df1b34fdb01665aa070c45f8558df595826765010dea6dee8\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:58:51.947773 containerd[1547]: 
time="2026-01-16T17:58:51.946904340Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\"" Jan 16 17:58:52.518926 systemd[1]: run-netns-cni\x2d74e5aa45\x2de225\x2d03a8\x2db1d5\x2d438ed2dd76f9.mount: Deactivated successfully. Jan 16 17:58:52.519097 systemd[1]: run-netns-cni\x2de7b82c52\x2dec12\x2ded7a\x2d50cc\x2d77c7913a61d9.mount: Deactivated successfully. Jan 16 17:58:52.519191 systemd[1]: run-netns-cni\x2d29feefd9\x2d01c9\x2dfd5a\x2d242d\x2d1689f3183be6.mount: Deactivated successfully. Jan 16 17:58:52.519277 systemd[1]: run-netns-cni\x2d6a1c9435\x2d7682\x2ddb58\x2da069\x2da572d34e3afe.mount: Deactivated successfully. Jan 16 17:58:52.755191 systemd[1]: Created slice kubepods-besteffort-pod8f14bccb_f353_467f_b549_674ae9114a0e.slice - libcontainer container kubepods-besteffort-pod8f14bccb_f353_467f_b549_674ae9114a0e.slice. Jan 16 17:58:52.759080 containerd[1547]: time="2026-01-16T17:58:52.759034414Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lqf7c,Uid:8f14bccb-f353-467f-b549-674ae9114a0e,Namespace:calico-system,Attempt:0,}" Jan 16 17:58:52.818475 containerd[1547]: time="2026-01-16T17:58:52.818191569Z" level=error msg="Failed to destroy network for sandbox \"6f7aa6fbd5ab5a7e5f25857b3c6795d71185cbe508fd01d265906de15283ab1e\"" error="plugin type=\"calico\" failed (delete): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:52.821623 systemd[1]: run-netns-cni\x2d3be87795\x2d3cae\x2d3be7\x2d3660\x2d9c5aab6bd0de.mount: Deactivated successfully. Jan 16 17:58:52.823524 containerd[1547]: time="2026-01-16T17:58:52.823444733Z" level=error msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lqf7c,Uid:8f14bccb-f353-467f-b549-674ae9114a0e,Namespace:calico-system,Attempt:0,} failed, error" error="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f7aa6fbd5ab5a7e5f25857b3c6795d71185cbe508fd01d265906de15283ab1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:52.823879 kubelet[2825]: E0116 17:58:52.823837 2825 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f7aa6fbd5ab5a7e5f25857b3c6795d71185cbe508fd01d265906de15283ab1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" Jan 16 17:58:52.824586 kubelet[2825]: E0116 17:58:52.824246 2825 kuberuntime_sandbox.go:71] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f7aa6fbd5ab5a7e5f25857b3c6795d71185cbe508fd01d265906de15283ab1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/" pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:52.824586 kubelet[2825]: E0116 17:58:52.824289 2825 kuberuntime_manager.go:1343] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to setup network for sandbox \"6f7aa6fbd5ab5a7e5f25857b3c6795d71185cbe508fd01d265906de15283ab1e\": plugin type=\"calico\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted 
/var/lib/calico/" pod="calico-system/csi-node-driver-lqf7c" Jan 16 17:58:52.824586 kubelet[2825]: E0116 17:58:52.824360 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e)\\\": rpc error: code = Unknown desc = failed to setup network for sandbox \\\"6f7aa6fbd5ab5a7e5f25857b3c6795d71185cbe508fd01d265906de15283ab1e\\\": plugin type=\\\"calico\\\" failed (add): stat /var/lib/calico/nodename: no such file or directory: check that the calico/node container is running and has mounted /var/lib/calico/\"" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:58:59.393072 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2016226582.mount: Deactivated successfully. Jan 16 17:58:59.417659 containerd[1547]: time="2026-01-16T17:58:59.417532775Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node:v3.30.4\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:59.418650 containerd[1547]: time="2026-01-16T17:58:59.418524424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node:v3.30.4: active requests=0, bytes read=150930912" Jan 16 17:58:59.420324 containerd[1547]: time="2026-01-16T17:58:59.419409028Z" level=info msg="ImageCreate event name:\"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:59.423747 containerd[1547]: time="2026-01-16T17:58:59.423708480Z" level=info msg="ImageCreate event name:\"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\" labels:{key:\"io.cri-containerd.image\" value:\"managed\"}" Jan 16 17:58:59.424919 containerd[1547]: time="2026-01-16T17:58:59.424863217Z" level=info msg="Pulled image \"ghcr.io/flatcar/calico/node:v3.30.4\" with image id \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\", repo tag \"ghcr.io/flatcar/calico/node:v3.30.4\", repo digest \"ghcr.io/flatcar/calico/node@sha256:e92cca333202c87d07bf57f38182fd68f0779f912ef55305eda1fccc9f33667c\", size \"150934424\" in 7.477891113s" Jan 16 17:58:59.424919 containerd[1547]: time="2026-01-16T17:58:59.424906579Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node:v3.30.4\" returns image reference \"sha256:43a5290057a103af76996c108856f92ed902f34573d7a864f55f15b8aaf4683b\"" Jan 16 17:58:59.448386 containerd[1547]: time="2026-01-16T17:58:59.447877751Z" level=info msg="CreateContainer within sandbox \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\" for container &ContainerMetadata{Name:calico-node,Attempt:0,}" Jan 16 17:58:59.460989 containerd[1547]: time="2026-01-16T17:58:59.460721664Z" level=info msg="Container 02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:58:59.464015 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount2177800508.mount: Deactivated successfully. 
Jan 16 17:58:59.480420 containerd[1547]: time="2026-01-16T17:58:59.480317470Z" level=info msg="CreateContainer within sandbox \"ccdee6dbf8eda159d167fd2a6556a0d9f6c3c09501647eb86807e2611968d070\" for &ContainerMetadata{Name:calico-node,Attempt:0,} returns container id \"02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6\"" Jan 16 17:58:59.483109 containerd[1547]: time="2026-01-16T17:58:59.482936919Z" level=info msg="StartContainer for \"02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6\"" Jan 16 17:58:59.487352 containerd[1547]: time="2026-01-16T17:58:59.487313655Z" level=info msg="connecting to shim 02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6" address="unix:///run/containerd/s/384aaf283451a91f809d6de45413e08368e6f6e32978d07e8953510a97ee1e85" protocol=ttrpc version=3 Jan 16 17:58:59.509758 systemd[1]: Started cri-containerd-02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6.scope - libcontainer container 02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6. Jan 16 17:58:59.569000 audit: BPF prog-id=170 op=LOAD Jan 16 17:58:59.569000 audit[3887]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.574131 kernel: audit: type=1334 audit(1768586339.569:564): prog-id=170 op=LOAD Jan 16 17:58:59.574192 kernel: audit: type=1300 audit(1768586339.569:564): arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.574228 kernel: audit: type=1327 audit(1768586339.569:564): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.569000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.569000 audit: BPF prog-id=171 op=LOAD Jan 16 17:58:59.569000 audit[3887]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.579305 kernel: audit: type=1334 audit(1768586339.569:565): prog-id=171 op=LOAD Jan 16 17:58:59.579474 kernel: audit: type=1300 audit(1768586339.569:565): arch=c00000b7 syscall=280 success=yes exit=22 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.569000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.581614 kernel: audit: type=1327 audit(1768586339.569:565): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.569000 audit: BPF prog-id=171 op=UNLOAD Jan 16 17:58:59.582591 kernel: audit: type=1334 audit(1768586339.569:566): prog-id=171 op=UNLOAD Jan 16 17:58:59.582652 kernel: audit: type=1300 audit(1768586339.569:566): arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.569000 audit[3887]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=16 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.569000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.586905 kernel: audit: type=1327 audit(1768586339.569:566): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.569000 audit: BPF prog-id=170 op=UNLOAD Jan 16 17:58:59.587828 kernel: audit: type=1334 audit(1768586339.569:567): prog-id=170 op=UNLOAD Jan 16 17:58:59.569000 audit[3887]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=14 a1=0 a2=0 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.569000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.569000 audit: BPF prog-id=172 op=LOAD Jan 16 17:58:59.569000 audit[3887]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=20 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=3423 pid=3887 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:58:59.569000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3032633931393637386434613966663133646236333335383933643564 Jan 16 17:58:59.609009 
containerd[1547]: time="2026-01-16T17:58:59.608880766Z" level=info msg="StartContainer for \"02c919678d4a9ff13db6335893d5d05d5e643b031390dba92b23146261de07d6\" returns successfully" Jan 16 17:58:59.767769 kernel: wireguard: WireGuard 1.0.0 loaded. See www.wireguard.com for information. Jan 16 17:58:59.767898 kernel: wireguard: Copyright (C) 2015-2019 Jason A. Donenfeld . All Rights Reserved. Jan 16 17:59:00.002305 kubelet[2825]: I0116 17:59:00.000942 2825 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-backend-key-pair\") pod \"4cc0644f-445f-4df1-845a-17b848af3c7f\" (UID: \"4cc0644f-445f-4df1-845a-17b848af3c7f\") " Jan 16 17:59:00.002305 kubelet[2825]: I0116 17:59:00.001026 2825 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4f6pb\" (UniqueName: \"kubernetes.io/projected/4cc0644f-445f-4df1-845a-17b848af3c7f-kube-api-access-4f6pb\") pod \"4cc0644f-445f-4df1-845a-17b848af3c7f\" (UID: \"4cc0644f-445f-4df1-845a-17b848af3c7f\") " Jan 16 17:59:00.002305 kubelet[2825]: I0116 17:59:00.001051 2825 reconciler_common.go:163] "operationExecutor.UnmountVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-ca-bundle\") pod \"4cc0644f-445f-4df1-845a-17b848af3c7f\" (UID: \"4cc0644f-445f-4df1-845a-17b848af3c7f\") " Jan 16 17:59:00.007725 kubelet[2825]: I0116 17:59:00.007641 2825 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-ca-bundle" (OuterVolumeSpecName: "whisker-ca-bundle") pod "4cc0644f-445f-4df1-845a-17b848af3c7f" (UID: "4cc0644f-445f-4df1-845a-17b848af3c7f"). InnerVolumeSpecName "whisker-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Jan 16 17:59:00.009111 kubelet[2825]: I0116 17:59:00.009066 2825 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-backend-key-pair" (OuterVolumeSpecName: "whisker-backend-key-pair") pod "4cc0644f-445f-4df1-845a-17b848af3c7f" (UID: "4cc0644f-445f-4df1-845a-17b848af3c7f"). InnerVolumeSpecName "whisker-backend-key-pair". PluginName "kubernetes.io/secret", VolumeGIDValue "" Jan 16 17:59:00.010573 kubelet[2825]: I0116 17:59:00.009447 2825 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4cc0644f-445f-4df1-845a-17b848af3c7f-kube-api-access-4f6pb" (OuterVolumeSpecName: "kube-api-access-4f6pb") pod "4cc0644f-445f-4df1-845a-17b848af3c7f" (UID: "4cc0644f-445f-4df1-845a-17b848af3c7f"). InnerVolumeSpecName "kube-api-access-4f6pb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Jan 16 17:59:00.102518 kubelet[2825]: I0116 17:59:00.102365 2825 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4f6pb\" (UniqueName: \"kubernetes.io/projected/4cc0644f-445f-4df1-845a-17b848af3c7f-kube-api-access-4f6pb\") on node \"ci-4580-0-0-p-03fd9ab712\" DevicePath \"\"" Jan 16 17:59:00.102518 kubelet[2825]: I0116 17:59:00.102396 2825 reconciler_common.go:299] "Volume detached for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-ca-bundle\") on node \"ci-4580-0-0-p-03fd9ab712\" DevicePath \"\"" Jan 16 17:59:00.102518 kubelet[2825]: I0116 17:59:00.102418 2825 reconciler_common.go:299] "Volume detached for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/4cc0644f-445f-4df1-845a-17b848af3c7f-whisker-backend-key-pair\") on node \"ci-4580-0-0-p-03fd9ab712\" DevicePath \"\"" Jan 16 17:59:00.301939 systemd[1]: Removed slice kubepods-besteffort-pod4cc0644f_445f_4df1_845a_17b848af3c7f.slice - libcontainer container kubepods-besteffort-pod4cc0644f_445f_4df1_845a_17b848af3c7f.slice. Jan 16 17:59:00.322536 kubelet[2825]: I0116 17:59:00.322436 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="calico-system/calico-node-wgssq" podStartSLOduration=3.464519236 podStartE2EDuration="20.322412526s" podCreationTimestamp="2026-01-16 17:58:40 +0000 UTC" firstStartedPulling="2026-01-16 17:58:42.568078141 +0000 UTC m=+33.955993956" lastFinishedPulling="2026-01-16 17:58:59.425971431 +0000 UTC m=+50.813887246" observedRunningTime="2026-01-16 17:59:00.029912101 +0000 UTC m=+51.417827916" watchObservedRunningTime="2026-01-16 17:59:00.322412526 +0000 UTC m=+51.710328341" Jan 16 17:59:00.379506 systemd[1]: Created slice kubepods-besteffort-pod78e68302_7ac2_49f4_8488_479ee8eaf4c9.slice - libcontainer container kubepods-besteffort-pod78e68302_7ac2_49f4_8488_479ee8eaf4c9.slice. Jan 16 17:59:00.398187 systemd[1]: var-lib-kubelet-pods-4cc0644f\x2d445f\x2d4df1\x2d845a\x2d17b848af3c7f-volumes-kubernetes.io\x7eprojected-kube\x2dapi\x2daccess\x2d4f6pb.mount: Deactivated successfully. Jan 16 17:59:00.398283 systemd[1]: var-lib-kubelet-pods-4cc0644f\x2d445f\x2d4df1\x2d845a\x2d17b848af3c7f-volumes-kubernetes.io\x7esecret-whisker\x2dbackend\x2dkey\x2dpair.mount: Deactivated successfully. 
Jan 16 17:59:00.506537 kubelet[2825]: I0116 17:59:00.506479 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/78e68302-7ac2-49f4-8488-479ee8eaf4c9-whisker-ca-bundle\") pod \"whisker-67547c9f8f-2pf9b\" (UID: \"78e68302-7ac2-49f4-8488-479ee8eaf4c9\") " pod="calico-system/whisker-67547c9f8f-2pf9b" Jan 16 17:59:00.506948 kubelet[2825]: I0116 17:59:00.506895 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m4tn5\" (UniqueName: \"kubernetes.io/projected/78e68302-7ac2-49f4-8488-479ee8eaf4c9-kube-api-access-m4tn5\") pod \"whisker-67547c9f8f-2pf9b\" (UID: \"78e68302-7ac2-49f4-8488-479ee8eaf4c9\") " pod="calico-system/whisker-67547c9f8f-2pf9b" Jan 16 17:59:00.507276 kubelet[2825]: I0116 17:59:00.507229 2825 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whisker-backend-key-pair\" (UniqueName: \"kubernetes.io/secret/78e68302-7ac2-49f4-8488-479ee8eaf4c9-whisker-backend-key-pair\") pod \"whisker-67547c9f8f-2pf9b\" (UID: \"78e68302-7ac2-49f4-8488-479ee8eaf4c9\") " pod="calico-system/whisker-67547c9f8f-2pf9b" Jan 16 17:59:00.688442 containerd[1547]: time="2026-01-16T17:59:00.688390293Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-67547c9f8f-2pf9b,Uid:78e68302-7ac2-49f4-8488-479ee8eaf4c9,Namespace:calico-system,Attempt:0,}" Jan 16 17:59:00.749685 kubelet[2825]: I0116 17:59:00.749653 2825 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4cc0644f-445f-4df1-845a-17b848af3c7f" path="/var/lib/kubelet/pods/4cc0644f-445f-4df1-845a-17b848af3c7f/volumes" Jan 16 17:59:00.891084 systemd-networkd[1470]: cali1e23c7e3433: Link UP Jan 16 17:59:00.892368 systemd-networkd[1470]: cali1e23c7e3433: Gained carrier Jan 16 17:59:00.919739 containerd[1547]: 2026-01-16 17:59:00.718 [INFO][3957] cni-plugin/utils.go 100: File /var/lib/calico/mtu does not exist Jan 16 17:59:00.919739 containerd[1547]: 2026-01-16 17:59:00.774 [INFO][3957] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0 whisker-67547c9f8f- calico-system 78e68302-7ac2-49f4-8488-479ee8eaf4c9 912 0 2026-01-16 17:59:00 +0000 UTC map[app.kubernetes.io/name:whisker k8s-app:whisker pod-template-hash:67547c9f8f projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:whisker] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 whisker-67547c9f8f-2pf9b eth0 whisker [] [] [kns.calico-system ksa.calico-system.whisker] cali1e23c7e3433 [] [] }} ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-" Jan 16 17:59:00.919739 containerd[1547]: 2026-01-16 17:59:00.775 [INFO][3957] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.919739 containerd[1547]: 2026-01-16 17:59:00.823 [INFO][3968] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" 
HandleID="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Workload="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.823 [INFO][3968] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" HandleID="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Workload="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"whisker-67547c9f8f-2pf9b", "timestamp":"2026-01-16 17:59:00.823107303 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.823 [INFO][3968] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.823 [INFO][3968] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.823 [INFO][3968] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.838 [INFO][3968] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.846 [INFO][3968] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.853 [INFO][3968] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.858 [INFO][3968] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.919988 containerd[1547]: 2026-01-16 17:59:00.861 [INFO][3968] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.861 [INFO][3968] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.863 [INFO][3968] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.867 [INFO][3968] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.876 [INFO][3968] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.1/26] block=192.168.125.0/26 handle="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.876 
[INFO][3968] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.1/26] handle="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.876 [INFO][3968] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 17:59:00.920221 containerd[1547]: 2026-01-16 17:59:00.876 [INFO][3968] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.1/26] IPv6=[] ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" HandleID="k8s-pod-network.8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Workload="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.920362 containerd[1547]: 2026-01-16 17:59:00.880 [INFO][3957] cni-plugin/k8s.go 418: Populated endpoint ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0", GenerateName:"whisker-67547c9f8f-", Namespace:"calico-system", SelfLink:"", UID:"78e68302-7ac2-49f4-8488-479ee8eaf4c9", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 59, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"67547c9f8f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"whisker-67547c9f8f-2pf9b", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.125.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1e23c7e3433", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:00.920362 containerd[1547]: 2026-01-16 17:59:00.880 [INFO][3957] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.1/32] ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.920434 containerd[1547]: 2026-01-16 17:59:00.880 [INFO][3957] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali1e23c7e3433 ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.920434 containerd[1547]: 2026-01-16 17:59:00.893 [INFO][3957] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" 
WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.920479 containerd[1547]: 2026-01-16 17:59:00.894 [INFO][3957] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0", GenerateName:"whisker-67547c9f8f-", Namespace:"calico-system", SelfLink:"", UID:"78e68302-7ac2-49f4-8488-479ee8eaf4c9", ResourceVersion:"912", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 59, 0, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"whisker", "k8s-app":"whisker", "pod-template-hash":"67547c9f8f", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"whisker"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a", Pod:"whisker-67547c9f8f-2pf9b", Endpoint:"eth0", ServiceAccountName:"whisker", IPNetworks:[]string{"192.168.125.1/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.whisker"}, InterfaceName:"cali1e23c7e3433", MAC:"fe:6f:1a:78:22:68", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:00.920527 containerd[1547]: 2026-01-16 17:59:00.915 [INFO][3957] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" Namespace="calico-system" Pod="whisker-67547c9f8f-2pf9b" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-whisker--67547c9f8f--2pf9b-eth0" Jan 16 17:59:00.948934 containerd[1547]: time="2026-01-16T17:59:00.948691787Z" level=info msg="connecting to shim 8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a" address="unix:///run/containerd/s/53ab1620ff528f40914d2583eec7ede8efe009912d9bb96981e67bb88865a255" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:00.973780 systemd[1]: Started cri-containerd-8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a.scope - libcontainer container 8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a. 
Jan 16 17:59:00.986000 audit: BPF prog-id=173 op=LOAD Jan 16 17:59:00.987000 audit: BPF prog-id=174 op=LOAD Jan 16 17:59:00.987000 audit[4002]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138180 a2=98 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:00.987000 audit: BPF prog-id=174 op=UNLOAD Jan 16 17:59:00.987000 audit[4002]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.987000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:00.988000 audit: BPF prog-id=175 op=LOAD Jan 16 17:59:00.988000 audit[4002]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001383e8 a2=98 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:00.988000 audit: BPF prog-id=176 op=LOAD Jan 16 17:59:00.988000 audit[4002]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000138168 a2=98 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:00.988000 audit: BPF prog-id=176 op=UNLOAD Jan 16 17:59:00.988000 audit[4002]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:00.988000 audit: BPF prog-id=175 op=UNLOAD Jan 16 17:59:00.988000 audit[4002]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:00.988000 audit: BPF prog-id=177 op=LOAD Jan 16 17:59:00.988000 audit[4002]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000138648 a2=98 a3=0 items=0 ppid=3991 pid=4002 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:00.988000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3863373931333434646535353233623936653938353066363665613436 Jan 16 17:59:01.035579 containerd[1547]: time="2026-01-16T17:59:01.034748408Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:whisker-67547c9f8f-2pf9b,Uid:78e68302-7ac2-49f4-8488-479ee8eaf4c9,Namespace:calico-system,Attempt:0,} returns sandbox id \"8c791344de5523b96e9850f66ea464d57f6c4a66343e4c8012f0388d3dfd001a\"" Jan 16 17:59:01.037693 containerd[1547]: time="2026-01-16T17:59:01.037634547Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 17:59:01.369204 containerd[1547]: time="2026-01-16T17:59:01.368926462Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:01.384789 containerd[1547]: time="2026-01-16T17:59:01.384645381Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 17:59:01.384789 containerd[1547]: time="2026-01-16T17:59:01.384692424Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:01.385031 kubelet[2825]: E0116 17:59:01.384950 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 17:59:01.385031 kubelet[2825]: E0116 17:59:01.385002 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 17:59:01.387395 kubelet[2825]: E0116 17:59:01.387354 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:01.389255 containerd[1547]: time="2026-01-16T17:59:01.389212842Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 17:59:01.640000 audit: BPF prog-id=178 op=LOAD Jan 16 17:59:01.640000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcaaa9898 a2=98 a3=ffffcaaa9888 items=0 ppid=4060 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.640000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 17:59:01.640000 audit: BPF prog-id=178 op=UNLOAD Jan 16 17:59:01.640000 audit[4176]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffcaaa9868 a3=0 items=0 ppid=4060 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.640000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 17:59:01.641000 audit: BPF prog-id=179 op=LOAD Jan 16 17:59:01.641000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcaaa9748 a2=74 a3=95 items=0 ppid=4060 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.641000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 17:59:01.641000 audit: BPF prog-id=179 op=UNLOAD Jan 16 17:59:01.641000 audit[4176]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4060 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.641000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 17:59:01.641000 audit: BPF prog-id=180 op=LOAD Jan 16 17:59:01.641000 audit[4176]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffcaaa9778 a2=40 a3=ffffcaaa97a8 items=0 ppid=4060 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.641000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 17:59:01.641000 audit: BPF prog-id=180 op=UNLOAD Jan 16 17:59:01.641000 audit[4176]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffcaaa97a8 items=0 ppid=4060 pid=4176 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.641000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F74632F676C6F62616C732F63616C695F63746C625F70726F677300747970650070726F675F6172726179006B657900340076616C7565003400656E74726965730033006E616D650063616C695F63746C625F70726F677300666C6167730030 Jan 16 17:59:01.643000 audit: BPF prog-id=181 op=LOAD Jan 16 17:59:01.643000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffed4912e8 a2=98 a3=ffffed4912d8 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.643000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.643000 audit: BPF prog-id=181 op=UNLOAD Jan 16 17:59:01.643000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffed4912b8 a3=0 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.643000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.643000 audit: BPF prog-id=182 op=LOAD Jan 16 17:59:01.643000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffed490f78 a2=74 a3=95 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.643000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.643000 audit: BPF prog-id=182 op=UNLOAD Jan 16 17:59:01.643000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.643000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.643000 audit: BPF prog-id=183 op=LOAD Jan 16 17:59:01.643000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffed490fd8 a2=94 a3=2 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.643000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.643000 audit: BPF prog-id=183 op=UNLOAD Jan 16 17:59:01.643000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 
ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.643000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.742814 containerd[1547]: time="2026-01-16T17:59:01.742719910Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:01.744343 containerd[1547]: time="2026-01-16T17:59:01.744271905Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 17:59:01.744576 containerd[1547]: time="2026-01-16T17:59:01.744354269Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:01.744644 kubelet[2825]: E0116 17:59:01.744584 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 17:59:01.744700 kubelet[2825]: E0116 17:59:01.744630 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 17:59:01.744760 kubelet[2825]: E0116 17:59:01.744731 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:01.744806 kubelet[2825]: E0116 17:59:01.744781 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:01.755000 audit: BPF prog-id=184 op=LOAD Jan 16 17:59:01.755000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=ffffed490f98 a2=40 a3=ffffed490fc8 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.755000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.755000 audit: BPF prog-id=184 op=UNLOAD Jan 
16 17:59:01.755000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=ffffed490fc8 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.755000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.764000 audit: BPF prog-id=185 op=LOAD Jan 16 17:59:01.764000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffed490fa8 a2=94 a3=4 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.764000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=185 op=UNLOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=186 op=LOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffed490de8 a2=94 a3=5 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=186 op=UNLOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=187 op=LOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffed491018 a2=94 a3=6 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=187 op=UNLOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=188 op=LOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=ffffed4907e8 a2=94 a3=83 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.765000 audit: BPF prog-id=189 op=LOAD Jan 16 17:59:01.765000 audit[4177]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=ffffed4905a8 a2=94 a3=2 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.765000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.766000 audit: BPF prog-id=189 op=UNLOAD Jan 16 17:59:01.766000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.766000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.766000 audit: BPF prog-id=188 op=UNLOAD Jan 16 17:59:01.766000 audit[4177]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=2c489620 a3=2c47cb00 items=0 ppid=4060 pid=4177 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.766000 audit: PROCTITLE proctitle=627066746F6F6C006D6170006C697374002D2D6A736F6E Jan 16 17:59:01.780000 audit: BPF prog-id=190 op=LOAD Jan 16 17:59:01.780000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd36165e8 a2=98 a3=ffffd36165d8 items=0 ppid=4060 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.780000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 17:59:01.780000 audit: BPF prog-id=190 op=UNLOAD Jan 16 17:59:01.780000 audit[4180]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffd36165b8 a3=0 items=0 ppid=4060 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.780000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 17:59:01.780000 audit: BPF prog-id=191 op=LOAD Jan 16 17:59:01.780000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd3616498 a2=74 a3=95 items=0 ppid=4060 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.780000 audit: PROCTITLE 
proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 17:59:01.780000 audit: BPF prog-id=191 op=UNLOAD Jan 16 17:59:01.780000 audit[4180]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4060 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.780000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 17:59:01.780000 audit: BPF prog-id=192 op=LOAD Jan 16 17:59:01.780000 audit[4180]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffd36164c8 a2=40 a3=ffffd36164f8 items=0 ppid=4060 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.780000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 17:59:01.780000 audit: BPF prog-id=192 op=UNLOAD Jan 16 17:59:01.780000 audit[4180]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=40 a3=ffffd36164f8 items=0 ppid=4060 pid=4180 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.780000 audit: PROCTITLE proctitle=627066746F6F6C006D617000637265617465002F7379732F66732F6270662F63616C69636F2F63616C69636F5F6661696C736166655F706F7274735F763100747970650068617368006B657900340076616C7565003100656E7472696573003635353335006E616D650063616C69636F5F6661696C736166655F706F7274735F Jan 16 17:59:01.851771 systemd-networkd[1470]: vxlan.calico: Link UP Jan 16 17:59:01.851785 systemd-networkd[1470]: vxlan.calico: Gained carrier Jan 16 17:59:01.883000 audit: BPF prog-id=193 op=LOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc887e708 a2=98 a3=ffffc887e6f8 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=193 op=UNLOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=ffffc887e6d8 a3=0 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 
17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=194 op=LOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc887e3e8 a2=74 a3=95 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=194 op=UNLOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=74 a3=95 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=195 op=LOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=ffffc887e448 a2=94 a3=2 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=195 op=UNLOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=70 a3=2 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=196 op=LOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc887e2c8 a2=40 a3=ffffc887e2f8 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=196 op=UNLOAD Jan 16 17:59:01.883000 
audit[4207]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=40 a3=ffffc887e2f8 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=197 op=LOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc887e418 a2=94 a3=b7 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.883000 audit: BPF prog-id=197 op=UNLOAD Jan 16 17:59:01.883000 audit[4207]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=b7 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.883000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.885000 audit: BPF prog-id=198 op=LOAD Jan 16 17:59:01.885000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc887dac8 a2=94 a3=2 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.885000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.885000 audit: BPF prog-id=198 op=UNLOAD Jan 16 17:59:01.885000 audit[4207]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=2 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.885000 audit: PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.885000 audit: BPF prog-id=199 op=LOAD Jan 16 17:59:01.885000 audit[4207]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=ffffc887dc58 a2=94 a3=30 items=0 ppid=4060 pid=4207 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.885000 audit: 
PROCTITLE proctitle=627066746F6F6C0070726F67006C6F6164002F7573722F6C69622F63616C69636F2F6270662F66696C7465722E6F002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41007479706500786470 Jan 16 17:59:01.892000 audit: BPF prog-id=200 op=LOAD Jan 16 17:59:01.892000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=3 a0=5 a1=fffff3144478 a2=98 a3=fffff3144468 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.892000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:01.892000 audit: BPF prog-id=200 op=UNLOAD Jan 16 17:59:01.892000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=3 a1=57156c a2=fffff3144448 a3=0 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.892000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:01.892000 audit: BPF prog-id=201 op=LOAD Jan 16 17:59:01.892000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff3144108 a2=74 a3=95 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.892000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:01.892000 audit: BPF prog-id=201 op=UNLOAD Jan 16 17:59:01.892000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=74 a3=95 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.892000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:01.892000 audit: BPF prog-id=202 op=LOAD Jan 16 17:59:01.892000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff3144168 a2=94 a3=2 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.892000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:01.892000 audit: BPF prog-id=202 op=UNLOAD Jan 16 17:59:01.892000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=70 a3=2 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 
egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:01.892000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.002592 kubelet[2825]: E0116 17:59:02.002480 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:02.032000 audit: BPF prog-id=203 op=LOAD Jan 16 17:59:02.032000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=4 a0=5 a1=fffff3144128 a2=40 a3=fffff3144158 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.032000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.032000 audit: BPF prog-id=203 op=UNLOAD Jan 16 17:59:02.032000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=4 a1=57156c a2=40 a3=fffff3144158 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.032000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.038000 audit[4233]: NETFILTER_CFG table=filter:119 family=2 entries=20 op=nft_register_rule pid=4233 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:02.038000 audit[4233]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffccc5ecc0 a2=0 a3=1 items=0 ppid=2976 pid=4233 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.038000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:02.041000 audit[4233]: NETFILTER_CFG table=nat:120 family=2 entries=14 op=nft_register_rule pid=4233 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:02.041000 audit[4233]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffccc5ecc0 a2=0 a3=1 items=0 ppid=2976 pid=4233 auid=4294967295 uid=0 gid=0 
euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.041000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:02.049000 audit: BPF prog-id=204 op=LOAD Jan 16 17:59:02.049000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff3144138 a2=94 a3=4 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.049000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.049000 audit: BPF prog-id=204 op=UNLOAD Jan 16 17:59:02.049000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=4 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.049000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.049000 audit: BPF prog-id=205 op=LOAD Jan 16 17:59:02.049000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=6 a0=5 a1=fffff3143f78 a2=94 a3=5 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.049000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.049000 audit: BPF prog-id=205 op=UNLOAD Jan 16 17:59:02.049000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=6 a1=57156c a2=70 a3=5 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.049000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.049000 audit: BPF prog-id=206 op=LOAD Jan 16 17:59:02.049000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff31441a8 a2=94 a3=6 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.049000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.049000 audit: BPF prog-id=206 op=UNLOAD Jan 16 17:59:02.049000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=70 a3=6 items=0 ppid=4060 
pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.049000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.050000 audit: BPF prog-id=207 op=LOAD Jan 16 17:59:02.050000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=5 a0=5 a1=fffff3143978 a2=94 a3=83 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.050000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.050000 audit: BPF prog-id=208 op=LOAD Jan 16 17:59:02.050000 audit[4211]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=7 a0=5 a1=fffff3143738 a2=94 a3=2 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.050000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.050000 audit: BPF prog-id=208 op=UNLOAD Jan 16 17:59:02.050000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=7 a1=57156c a2=c a3=0 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.050000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.050000 audit: BPF prog-id=207 op=UNLOAD Jan 16 17:59:02.050000 audit[4211]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=5 a1=57156c a2=1a832620 a3=1a825b00 items=0 ppid=4060 pid=4211 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="bpftool" exe="/usr/bin/bpftool" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.050000 audit: PROCTITLE proctitle=627066746F6F6C002D2D6A736F6E002D2D7072657474790070726F670073686F770070696E6E6564002F7379732F66732F6270662F63616C69636F2F7864702F70726566696C7465725F76315F63616C69636F5F746D705F41 Jan 16 17:59:02.057000 audit: BPF prog-id=199 op=UNLOAD Jan 16 17:59:02.057000 audit[4060]: SYSCALL arch=c00000b7 syscall=35 success=yes exit=0 a0=ffffffffffffff9c a1=4000e75200 a2=0 a3=0 items=0 ppid=4053 pid=4060 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="calico-node" exe="/usr/bin/calico-node" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.057000 audit: PROCTITLE proctitle=63616C69636F2D6E6F6465002D66656C6978 Jan 16 17:59:02.144000 audit[4261]: NETFILTER_CFG table=nat:121 family=2 entries=15 op=nft_register_chain pid=4261 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 
17:59:02.144000 audit[4261]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5084 a0=3 a1=ffffef913dc0 a2=0 a3=ffffb5136fa8 items=0 ppid=4060 pid=4261 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.144000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:02.153000 audit[4267]: NETFILTER_CFG table=mangle:122 family=2 entries=16 op=nft_register_chain pid=4267 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:02.153000 audit[4267]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=6868 a0=3 a1=ffffc6d10b90 a2=0 a3=ffffb8404fa8 items=0 ppid=4060 pid=4267 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.153000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:02.161000 audit[4262]: NETFILTER_CFG table=raw:123 family=2 entries=21 op=nft_register_chain pid=4262 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:02.161000 audit[4262]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=8452 a0=3 a1=ffffe2dde0f0 a2=0 a3=ffffa68e1fa8 items=0 ppid=4060 pid=4262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.161000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:02.163000 audit[4263]: NETFILTER_CFG table=filter:124 family=2 entries=94 op=nft_register_chain pid=4263 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:02.163000 audit[4263]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=53116 a0=3 a1=ffffd3fa0160 a2=0 a3=ffffbe71dfa8 items=0 ppid=4060 pid=4263 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:02.163000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:02.748339 systemd-networkd[1470]: cali1e23c7e3433: Gained IPv6LL Jan 16 17:59:02.757754 containerd[1547]: time="2026-01-16T17:59:02.755953686Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gshmq,Uid:e6a77609-327b-44be-be8e-e3bd4ce03cac,Namespace:kube-system,Attempt:0,}" Jan 16 17:59:02.913472 systemd-networkd[1470]: cali5d9c0a47872: Link UP Jan 16 17:59:02.915394 systemd-networkd[1470]: cali5d9c0a47872: Gained carrier Jan 16 17:59:02.939006 containerd[1547]: 2026-01-16 17:59:02.812 [INFO][4278] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0 coredns-66bc5c9577- kube-system e6a77609-327b-44be-be8e-e3bd4ce03cac 839 0 2026-01-16 
17:58:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 coredns-66bc5c9577-gshmq eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali5d9c0a47872 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-" Jan 16 17:59:02.939006 containerd[1547]: 2026-01-16 17:59:02.812 [INFO][4278] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.939006 containerd[1547]: 2026-01-16 17:59:02.850 [INFO][4290] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" HandleID="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Workload="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.851 [INFO][4290] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" HandleID="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Workload="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d3870), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"coredns-66bc5c9577-gshmq", "timestamp":"2026-01-16 17:59:02.850866746 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.851 [INFO][4290] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.851 [INFO][4290] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
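The PROCTITLE fields in the audit records above are the process argv, hex-encoded with NUL bytes between arguments; the sketch below (assuming only Python 3 and its standard library, not anything shipped on this host) recovers the command lines behind them. The longer values decode to commands such as "bpftool map create /sys/fs/bpf/tc/globals/cali_ctlb_progs type prog_array key 4 value 4 entries 3 name cali_ctlb_progs flags 0" and "iptables-restore -w 5 --noflush --counters".

# Decode an audit PROCTITLE value: hex-encoded argv, NUL-separated.
def decode_proctitle(hex_value: str) -> str:
    raw = bytes.fromhex(hex_value)
    # Drop empty trailing fields and join the arguments back into a command line.
    return " ".join(arg.decode("utf-8", errors="replace")
                    for arg in raw.split(b"\x00") if arg)

# One of the short PROCTITLE values from the bpftool records above:
print(decode_proctitle("627066746F6F6C006D6170006C697374002D2D6A736F6E"))
# -> bpftool map list --json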
Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.851 [INFO][4290] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.864 [INFO][4290] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.871 [INFO][4290] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.878 [INFO][4290] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.881 [INFO][4290] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939227 containerd[1547]: 2026-01-16 17:59:02.884 [INFO][4290] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.884 [INFO][4290] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.887 [INFO][4290] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.896 [INFO][4290] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.904 [INFO][4290] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.2/26] block=192.168.125.0/26 handle="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.905 [INFO][4290] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.2/26] handle="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.905 [INFO][4290] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
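The IPAM records above show Calico claiming 192.168.125.2 for the coredns pod out of the node's affine 192.168.125.0/26 block; a minimal sketch of that block arithmetic, again assuming only the Python standard library:

import ipaddress

# A /26 block holds 64 addresses; the pod IP claimed in the log falls inside it.
block = ipaddress.ip_network("192.168.125.0/26")
pod_ip = ipaddress.ip_address("192.168.125.2")

print(block.num_addresses)      # 64
print(pod_ip in block)          # True
print(block.broadcast_address)  # 192.168.125.63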
Jan 16 17:59:02.939405 containerd[1547]: 2026-01-16 17:59:02.905 [INFO][4290] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.2/26] IPv6=[] ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" HandleID="k8s-pod-network.c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Workload="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.940614 containerd[1547]: 2026-01-16 17:59:02.908 [INFO][4278] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"e6a77609-327b-44be-be8e-e3bd4ce03cac", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"coredns-66bc5c9577-gshmq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.125.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5d9c0a47872", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:02.940614 containerd[1547]: 2026-01-16 17:59:02.909 [INFO][4278] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.2/32] ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.940614 containerd[1547]: 2026-01-16 17:59:02.909 [INFO][4278] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali5d9c0a47872 ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" 
WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.940614 containerd[1547]: 2026-01-16 17:59:02.916 [INFO][4278] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.940614 containerd[1547]: 2026-01-16 17:59:02.917 [INFO][4278] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"e6a77609-327b-44be-be8e-e3bd4ce03cac", ResourceVersion:"839", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd", Pod:"coredns-66bc5c9577-gshmq", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.125.2/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali5d9c0a47872", MAC:"fa:a5:62:38:7f:07", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:02.940853 containerd[1547]: 2026-01-16 17:59:02.934 [INFO][4278] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" Namespace="kube-system" Pod="coredns-66bc5c9577-gshmq" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--gshmq-eth0" Jan 16 17:59:02.978284 containerd[1547]: time="2026-01-16T17:59:02.977678650Z" level=info msg="connecting to shim c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd" 
address="unix:///run/containerd/s/52bedc0eb182c70820339d83f316ac70ddfa2f4e0634215484cb7038f015a21a" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:03.006585 kubelet[2825]: E0116 17:59:03.006352 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:03.021092 systemd[1]: Started cri-containerd-c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd.scope - libcontainer container c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd. Jan 16 17:59:03.089000 audit: BPF prog-id=209 op=LOAD Jan 16 17:59:03.089000 audit: BPF prog-id=210 op=LOAD Jan 16 17:59:03.089000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.089000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.089000 audit: BPF prog-id=210 op=UNLOAD Jan 16 17:59:03.089000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.089000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.090000 audit: BPF prog-id=211 op=LOAD Jan 16 17:59:03.090000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.090000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.090000 audit: BPF prog-id=212 op=LOAD Jan 16 17:59:03.090000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 
a1=4000130168 a2=98 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.090000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.091000 audit: BPF prog-id=212 op=UNLOAD Jan 16 17:59:03.091000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.091000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.091000 audit: BPF prog-id=211 op=UNLOAD Jan 16 17:59:03.091000 audit[4322]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.091000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.091000 audit: BPF prog-id=213 op=LOAD Jan 16 17:59:03.091000 audit[4322]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4310 pid=4322 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.091000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331323134363534323663343030383338356236303732396231333139 Jan 16 17:59:03.100000 audit[4344]: NETFILTER_CFG table=filter:125 family=2 entries=42 op=nft_register_chain pid=4344 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:03.100000 audit[4344]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=22552 a0=3 a1=fffff4da5f40 a2=0 a3=ffffb488efa8 items=0 ppid=4060 pid=4344 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.100000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:03.139043 containerd[1547]: time="2026-01-16T17:59:03.138995907Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-gshmq,Uid:e6a77609-327b-44be-be8e-e3bd4ce03cac,Namespace:kube-system,Attempt:0,} returns 
sandbox id \"c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd\"" Jan 16 17:59:03.145930 containerd[1547]: time="2026-01-16T17:59:03.145789149Z" level=info msg="CreateContainer within sandbox \"c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 16 17:59:03.166582 containerd[1547]: time="2026-01-16T17:59:03.166514731Z" level=info msg="Container 32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:59:03.172919 containerd[1547]: time="2026-01-16T17:59:03.172845351Z" level=info msg="CreateContainer within sandbox \"c121465426c4008385b60729b13195b4aa616f0f54749129fa2022b601b80bbd\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923\"" Jan 16 17:59:03.173529 containerd[1547]: time="2026-01-16T17:59:03.173498302Z" level=info msg="StartContainer for \"32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923\"" Jan 16 17:59:03.174702 containerd[1547]: time="2026-01-16T17:59:03.174625235Z" level=info msg="connecting to shim 32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923" address="unix:///run/containerd/s/52bedc0eb182c70820339d83f316ac70ddfa2f4e0634215484cb7038f015a21a" protocol=ttrpc version=3 Jan 16 17:59:03.203795 systemd[1]: Started cri-containerd-32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923.scope - libcontainer container 32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923. Jan 16 17:59:03.224000 audit: BPF prog-id=214 op=LOAD Jan 16 17:59:03.225000 audit: BPF prog-id=215 op=LOAD Jan 16 17:59:03.225000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0180 a2=98 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.225000 audit: BPF prog-id=215 op=UNLOAD Jan 16 17:59:03.225000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.225000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.226000 audit: BPF prog-id=216 op=LOAD Jan 16 17:59:03.226000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b03e8 a2=98 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.226000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.226000 audit: BPF prog-id=217 op=LOAD Jan 16 17:59:03.226000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001b0168 a2=98 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.226000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.227000 audit: BPF prog-id=217 op=UNLOAD Jan 16 17:59:03.227000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.227000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.227000 audit: BPF prog-id=216 op=UNLOAD Jan 16 17:59:03.227000 audit[4352]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.227000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.227000 audit: BPF prog-id=218 op=LOAD Jan 16 17:59:03.227000 audit[4352]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001b0648 a2=98 a3=0 items=0 ppid=4310 pid=4352 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.227000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3332653961336632373332376232303130303433663331343937333766 Jan 16 17:59:03.267979 containerd[1547]: time="2026-01-16T17:59:03.267262986Z" level=info msg="StartContainer for \"32e9a3f27327b2010043f3149737fd82912b4ebb1e81887eec872c7398b57923\" returns successfully" Jan 16 17:59:03.706945 systemd-networkd[1470]: vxlan.calico: Gained IPv6LL Jan 16 17:59:03.748972 containerd[1547]: time="2026-01-16T17:59:03.748918137Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lqf7c,Uid:8f14bccb-f353-467f-b549-674ae9114a0e,Namespace:calico-system,Attempt:0,}" Jan 16 17:59:03.896829 systemd-networkd[1470]: 
cali7d0eafef29a: Link UP Jan 16 17:59:03.897820 systemd-networkd[1470]: cali7d0eafef29a: Gained carrier Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.803 [INFO][4387] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0 csi-node-driver- calico-system 8f14bccb-f353-467f-b549-674ae9114a0e 729 0 2026-01-16 17:58:40 +0000 UTC map[app.kubernetes.io/name:csi-node-driver controller-revision-hash:9d99788f7 k8s-app:csi-node-driver name:csi-node-driver pod-template-generation:1 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:csi-node-driver] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 csi-node-driver-lqf7c eth0 csi-node-driver [] [] [kns.calico-system ksa.calico-system.csi-node-driver] cali7d0eafef29a [] [] }} ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.803 [INFO][4387] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.831 [INFO][4398] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" HandleID="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Workload="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.831 [INFO][4398] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" HandleID="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Workload="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x40002d2fe0), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"csi-node-driver-lqf7c", "timestamp":"2026-01-16 17:59:03.831255119 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.831 [INFO][4398] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.831 [INFO][4398] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.831 [INFO][4398] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.847 [INFO][4398] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.854 [INFO][4398] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.862 [INFO][4398] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.866 [INFO][4398] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.869 [INFO][4398] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.869 [INFO][4398] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.872 [INFO][4398] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.877 [INFO][4398] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.886 [INFO][4398] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.3/26] block=192.168.125.0/26 handle="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.887 [INFO][4398] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.3/26] handle="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.887 [INFO][4398] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
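The ipam/ipam.go lines above trace one Calico CNI ADD through block-affinity IPAM: take the host-wide IPAM lock, confirm this node's affinity for 192.168.125.0/26, claim the next free /32 out of that block, write the block back, then release the lock. The Go sketch below only illustrates that claim-from-an-affine-block idea, with an in-process mutex standing in for the host-wide lock; it is not Calico's implementation, and every name in it is invented for the example.

package main

import (
	"fmt"
	"net/netip"
	"sync"
)

// blockAllocator is a toy stand-in for a node-affine IPAM block such as
// 192.168.125.0/26 in the log. Calico's real allocator persists the block
// in its datastore; here a mutex plays the role of the host-wide IPAM lock.
type blockAllocator struct {
	mu    sync.Mutex
	block netip.Prefix
	used  map[netip.Addr]string // addr -> handle, e.g. k8s-pod-network.<containerID>
}

func newBlockAllocator(cidr string) *blockAllocator {
	return &blockAllocator{block: netip.MustParsePrefix(cidr), used: map[netip.Addr]string{}}
}

// claim returns the first free address in the block and records the handle,
// mirroring "Attempting to assign 1 addresses from block" in the log.
func (b *blockAllocator) claim(handle string) (netip.Addr, error) {
	b.mu.Lock() // "Acquired host-wide IPAM lock."
	defer b.mu.Unlock()
	for a := b.block.Addr().Next(); b.block.Contains(a); a = a.Next() {
		if _, taken := b.used[a]; !taken {
			b.used[a] = handle // "Writing block in order to claim IPs"
			return a, nil
		}
	}
	return netip.Addr{}, fmt.Errorf("block %s exhausted", b.block)
}

func main() {
	alloc := newBlockAllocator("192.168.125.0/26")
	for _, h := range []string{"coredns-66bc5c9577-gshmq", "csi-node-driver-lqf7c"} {
		ip, err := alloc.claim(h)
		if err != nil {
			panic(err)
		}
		fmt.Printf("claimed %s/32 for %s\n", ip, h)
	}
}

Against a fresh block this hands out .1 first; in the log the block was not fresh, which is why the addresses claimed in this section start at 192.168.125.2.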
Jan 16 17:59:03.917942 containerd[1547]: 2026-01-16 17:59:03.887 [INFO][4398] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.3/26] IPv6=[] ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" HandleID="k8s-pod-network.c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Workload="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.918852 containerd[1547]: 2026-01-16 17:59:03.889 [INFO][4387] cni-plugin/k8s.go 418: Populated endpoint ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8f14bccb-f353-467f-b549-674ae9114a0e", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"csi-node-driver-lqf7c", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.125.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7d0eafef29a", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:03.918852 containerd[1547]: 2026-01-16 17:59:03.889 [INFO][4387] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.3/32] ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.918852 containerd[1547]: 2026-01-16 17:59:03.889 [INFO][4387] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali7d0eafef29a ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.918852 containerd[1547]: 2026-01-16 17:59:03.897 [INFO][4387] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.918852 containerd[1547]: 2026-01-16 17:59:03.898 [INFO][4387] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint 
ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0", GenerateName:"csi-node-driver-", Namespace:"calico-system", SelfLink:"", UID:"8f14bccb-f353-467f-b549-674ae9114a0e", ResourceVersion:"729", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"csi-node-driver", "controller-revision-hash":"9d99788f7", "k8s-app":"csi-node-driver", "name":"csi-node-driver", "pod-template-generation":"1", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"csi-node-driver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a", Pod:"csi-node-driver-lqf7c", Endpoint:"eth0", ServiceAccountName:"csi-node-driver", IPNetworks:[]string{"192.168.125.3/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.csi-node-driver"}, InterfaceName:"cali7d0eafef29a", MAC:"4a:f1:99:87:1e:70", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:03.918852 containerd[1547]: 2026-01-16 17:59:03.913 [INFO][4387] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" Namespace="calico-system" Pod="csi-node-driver-lqf7c" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-csi--node--driver--lqf7c-eth0" Jan 16 17:59:03.932000 audit[4414]: NETFILTER_CFG table=filter:126 family=2 entries=40 op=nft_register_chain pid=4414 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:03.932000 audit[4414]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=20764 a0=3 a1=ffffe2d3fe40 a2=0 a3=ffff8b169fa8 items=0 ppid=4060 pid=4414 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:03.932000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:03.947264 containerd[1547]: time="2026-01-16T17:59:03.947190895Z" level=info msg="connecting to shim c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a" address="unix:///run/containerd/s/f3e1000e938d55f53f19e679251bf3ff9f4aa968501599fecbc674058da72f9c" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:03.977790 systemd[1]: Started cri-containerd-c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a.scope - libcontainer container c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a. 
Jan 16 17:59:03.999000 audit: BPF prog-id=219 op=LOAD Jan 16 17:59:04.001000 audit: BPF prog-id=220 op=LOAD Jan 16 17:59:04.001000 audit[4435]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.001000 audit: BPF prog-id=220 op=UNLOAD Jan 16 17:59:04.001000 audit[4435]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.001000 audit: BPF prog-id=221 op=LOAD Jan 16 17:59:04.001000 audit[4435]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.001000 audit: BPF prog-id=222 op=LOAD Jan 16 17:59:04.001000 audit[4435]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.001000 audit: BPF prog-id=222 op=UNLOAD Jan 16 17:59:04.001000 audit[4435]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.001000 audit: BPF prog-id=221 op=UNLOAD Jan 16 17:59:04.001000 audit[4435]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.001000 audit: BPF prog-id=223 op=LOAD Jan 16 17:59:04.001000 audit[4435]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4424 pid=4435 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.001000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6331303162386635666438326330363365613865353630633864383464 Jan 16 17:59:04.028929 kubelet[2825]: I0116 17:59:04.028872 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-gshmq" podStartSLOduration=50.028834553 podStartE2EDuration="50.028834553s" podCreationTimestamp="2026-01-16 17:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:59:04.027004107 +0000 UTC m=+55.414919962" watchObservedRunningTime="2026-01-16 17:59:04.028834553 +0000 UTC m=+55.416750368" Jan 16 17:59:04.053753 containerd[1547]: time="2026-01-16T17:59:04.053677201Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:csi-node-driver-lqf7c,Uid:8f14bccb-f353-467f-b549-674ae9114a0e,Namespace:calico-system,Attempt:0,} returns sandbox id \"c101b8f5fd82c063ea8e560c8d84db07e3f17b910a7e24173dd2a42f5e2ccc1a\"" Jan 16 17:59:04.057610 containerd[1547]: time="2026-01-16T17:59:04.057569584Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 17:59:04.075000 audit[4463]: NETFILTER_CFG table=filter:127 family=2 entries=20 op=nft_register_rule pid=4463 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:04.075000 audit[4463]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=7480 a0=3 a1=ffffead124e0 a2=0 a3=1 items=0 ppid=2976 pid=4463 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.075000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:04.080000 audit[4463]: NETFILTER_CFG table=nat:128 family=2 entries=14 op=nft_register_rule pid=4463 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:04.080000 audit[4463]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=3468 a0=3 a1=ffffead124e0 a2=0 a3=1 items=0 ppid=2976 pid=4463 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.080000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:04.402983 containerd[1547]: time="2026-01-16T17:59:04.402445513Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:04.406053 containerd[1547]: time="2026-01-16T17:59:04.405901435Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:04.406053 containerd[1547]: time="2026-01-16T17:59:04.405902675Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 17:59:04.406489 kubelet[2825]: E0116 17:59:04.406372 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 17:59:04.406635 kubelet[2825]: E0116 17:59:04.406613 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 17:59:04.406794 kubelet[2825]: E0116 17:59:04.406771 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:04.409568 containerd[1547]: time="2026-01-16T17:59:04.409428721Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 17:59:04.604774 systemd-networkd[1470]: cali5d9c0a47872: Gained IPv6LL Jan 16 17:59:04.734268 containerd[1547]: time="2026-01-16T17:59:04.734198225Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:04.735887 containerd[1547]: time="2026-01-16T17:59:04.735806461Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 17:59:04.735990 containerd[1547]: time="2026-01-16T17:59:04.735929787Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:04.736257 kubelet[2825]: E0116 17:59:04.736156 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 17:59:04.736340 kubelet[2825]: E0116 17:59:04.736255 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 17:59:04.736426 kubelet[2825]: E0116 17:59:04.736360 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:04.736511 kubelet[2825]: E0116 17:59:04.736465 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:59:04.749641 containerd[1547]: time="2026-01-16T17:59:04.749507345Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-k622k,Uid:e2b5c1b0-147f-4d63-9d4a-0d66468ad158,Namespace:calico-apiserver,Attempt:0,}" Jan 16 17:59:04.751913 containerd[1547]: time="2026-01-16T17:59:04.751537840Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-xk99t,Uid:d584925c-30a0-4760-b407-e5a8a40d9f3d,Namespace:calico-apiserver,Attempt:0,}" Jan 16 17:59:04.906699 systemd-networkd[1470]: cali527513e87aa: Link UP Jan 16 17:59:04.908077 systemd-networkd[1470]: cali527513e87aa: Gained carrier Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.804 [INFO][4465] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0 calico-apiserver-767f75bcf4- calico-apiserver e2b5c1b0-147f-4d63-9d4a-0d66468ad158 845 0 2026-01-16 17:58:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:767f75bcf4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 calico-apiserver-767f75bcf4-k622k eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali527513e87aa [] [] }} ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.804 [INFO][4465] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.846 
[INFO][4489] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" HandleID="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.846 [INFO][4489] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" HandleID="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b820), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"calico-apiserver-767f75bcf4-k622k", "timestamp":"2026-01-16 17:59:04.846253972 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.846 [INFO][4489] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.846 [INFO][4489] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.846 [INFO][4489] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.857 [INFO][4489] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.864 [INFO][4489] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.870 [INFO][4489] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.872 [INFO][4489] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.875 [INFO][4489] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.875 [INFO][4489] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.877 [INFO][4489] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.882 [INFO][4489] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.892 [INFO][4489] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.4/26] 
block=192.168.125.0/26 handle="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.893 [INFO][4489] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.4/26] handle="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.893 [INFO][4489] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 17:59:04.934043 containerd[1547]: 2026-01-16 17:59:04.893 [INFO][4489] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.4/26] IPv6=[] ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" HandleID="k8s-pod-network.0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.935701 containerd[1547]: 2026-01-16 17:59:04.896 [INFO][4465] cni-plugin/k8s.go 418: Populated endpoint ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0", GenerateName:"calico-apiserver-767f75bcf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"e2b5c1b0-147f-4d63-9d4a-0d66468ad158", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"767f75bcf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"calico-apiserver-767f75bcf4-k622k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.125.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali527513e87aa", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:04.935701 containerd[1547]: 2026-01-16 17:59:04.897 [INFO][4465] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.4/32] ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.935701 containerd[1547]: 2026-01-16 17:59:04.897 [INFO][4465] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali527513e87aa ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" 
Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.935701 containerd[1547]: 2026-01-16 17:59:04.908 [INFO][4465] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.935701 containerd[1547]: 2026-01-16 17:59:04.909 [INFO][4465] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0", GenerateName:"calico-apiserver-767f75bcf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"e2b5c1b0-147f-4d63-9d4a-0d66468ad158", ResourceVersion:"845", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"767f75bcf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f", Pod:"calico-apiserver-767f75bcf4-k622k", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.125.4/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali527513e87aa", MAC:"06:84:1d:7f:6f:9e", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:04.935701 containerd[1547]: 2026-01-16 17:59:04.928 [INFO][4465] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-k622k" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--k622k-eth0" Jan 16 17:59:04.948000 audit[4511]: NETFILTER_CFG table=filter:129 family=2 entries=64 op=nft_register_chain pid=4511 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:04.950799 kernel: kauditd_printk_skb: 309 callbacks suppressed Jan 16 17:59:04.950895 kernel: audit: type=1325 audit(1768586344.948:673): table=filter:129 family=2 entries=64 op=nft_register_chain pid=4511 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:04.948000 audit[4511]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=33436 a0=3 a1=ffffca448ef0 a2=0 a3=ffff8eeacfa8 items=0 ppid=4060 pid=4511 auid=4294967295 uid=0 
gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.957451 kernel: audit: type=1300 audit(1768586344.948:673): arch=c00000b7 syscall=211 success=yes exit=33436 a0=3 a1=ffffca448ef0 a2=0 a3=ffff8eeacfa8 items=0 ppid=4060 pid=4511 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:04.948000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:04.960603 kernel: audit: type=1327 audit(1768586344.948:673): proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:04.972833 containerd[1547]: time="2026-01-16T17:59:04.972791759Z" level=info msg="connecting to shim 0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f" address="unix:///run/containerd/s/5ab01f23426b2347c8248f27303e2b11a8e30981e2a74f8e4bb30036192d689b" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:05.009823 systemd[1]: Started cri-containerd-0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f.scope - libcontainer container 0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f. Jan 16 17:59:05.021173 kubelet[2825]: E0116 17:59:05.020806 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:59:05.035443 systemd-networkd[1470]: cali933c0f7c4ba: Link UP Jan 16 17:59:05.036134 systemd-networkd[1470]: cali933c0f7c4ba: Gained carrier Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.809 [INFO][4471] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0 calico-apiserver-767f75bcf4- calico-apiserver d584925c-30a0-4760-b407-e5a8a40d9f3d 842 0 2026-01-16 17:58:31 +0000 UTC map[apiserver:true app.kubernetes.io/name:calico-apiserver k8s-app:calico-apiserver pod-template-hash:767f75bcf4 projectcalico.org/namespace:calico-apiserver projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-apiserver] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 calico-apiserver-767f75bcf4-xk99t eth0 calico-apiserver [] [] [kns.calico-apiserver ksa.calico-apiserver.calico-apiserver] cali933c0f7c4ba [] [] }} ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" 
Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.809 [INFO][4471] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.851 [INFO][4494] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" HandleID="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.851 [INFO][4494] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" HandleID="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b080), Attrs:map[string]string{"namespace":"calico-apiserver", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"calico-apiserver-767f75bcf4-xk99t", "timestamp":"2026-01-16 17:59:04.85111008 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.851 [INFO][4494] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.893 [INFO][4494] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.894 [INFO][4494] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.961 [INFO][4494] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.967 [INFO][4494] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.974 [INFO][4494] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.978 [INFO][4494] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.986 [INFO][4494] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.988 [INFO][4494] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:04.993 [INFO][4494] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60 Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:05.004 [INFO][4494] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:05.022 [INFO][4494] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.5/26] block=192.168.125.0/26 handle="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:05.022 [INFO][4494] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.5/26] handle="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:05.022 [INFO][4494] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
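Every workload endpoint in this section lands in the same node-affine block: coredns-66bc5c9577-gshmq at 192.168.125.2, csi-node-driver-lqf7c at .3, calico-apiserver-767f75bcf4-k622k at .4 and -xk99t at .5, all inside 192.168.125.0/26, which gives this node at most 64 pod addresses from that block before another block would have to be claimed. A quick standard-library check of membership and capacity (illustrative only, not part of the CNI plugin):

package main

import (
	"fmt"
	"net/netip"
)

func main() {
	// The node-affine block and the pod addresses assigned in this log.
	block := netip.MustParsePrefix("192.168.125.0/26")
	pods := map[string]string{
		"coredns-66bc5c9577-gshmq":          "192.168.125.2",
		"csi-node-driver-lqf7c":             "192.168.125.3",
		"calico-apiserver-767f75bcf4-k622k": "192.168.125.4",
		"calico-apiserver-767f75bcf4-xk99t": "192.168.125.5",
	}

	// A /26 spans 2^(32-26) = 64 addresses in total.
	fmt.Printf("block %s holds %d addresses\n", block, 1<<(32-block.Bits()))

	for pod, ip := range pods {
		addr := netip.MustParseAddr(ip)
		fmt.Printf("%-35s %s in %s: %v\n", pod, addr, block, block.Contains(addr))
	}
}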
Jan 16 17:59:05.063292 containerd[1547]: 2026-01-16 17:59:05.022 [INFO][4494] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.5/26] IPv6=[] ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" HandleID="k8s-pod-network.3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.065901 containerd[1547]: 2026-01-16 17:59:05.026 [INFO][4471] cni-plugin/k8s.go 418: Populated endpoint ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0", GenerateName:"calico-apiserver-767f75bcf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"d584925c-30a0-4760-b407-e5a8a40d9f3d", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"767f75bcf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"calico-apiserver-767f75bcf4-xk99t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.125.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali933c0f7c4ba", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:05.065901 containerd[1547]: 2026-01-16 17:59:05.026 [INFO][4471] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.5/32] ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.065901 containerd[1547]: 2026-01-16 17:59:05.026 [INFO][4471] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali933c0f7c4ba ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.065901 containerd[1547]: 2026-01-16 17:59:05.034 [INFO][4471] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.065901 containerd[1547]: 2026-01-16 
17:59:05.035 [INFO][4471] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0", GenerateName:"calico-apiserver-767f75bcf4-", Namespace:"calico-apiserver", SelfLink:"", UID:"d584925c-30a0-4760-b407-e5a8a40d9f3d", ResourceVersion:"842", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 31, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"apiserver":"true", "app.kubernetes.io/name":"calico-apiserver", "k8s-app":"calico-apiserver", "pod-template-hash":"767f75bcf4", "projectcalico.org/namespace":"calico-apiserver", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-apiserver"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60", Pod:"calico-apiserver-767f75bcf4-xk99t", Endpoint:"eth0", ServiceAccountName:"calico-apiserver", IPNetworks:[]string{"192.168.125.5/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-apiserver", "ksa.calico-apiserver.calico-apiserver"}, InterfaceName:"cali933c0f7c4ba", MAC:"d2:ea:68:7f:d1:39", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:05.065901 containerd[1547]: 2026-01-16 17:59:05.056 [INFO][4471] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" Namespace="calico-apiserver" Pod="calico-apiserver-767f75bcf4-xk99t" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--apiserver--767f75bcf4--xk99t-eth0" Jan 16 17:59:05.075000 audit: BPF prog-id=224 op=LOAD Jan 16 17:59:05.076691 kernel: audit: type=1334 audit(1768586345.075:674): prog-id=224 op=LOAD Jan 16 17:59:05.075000 audit: BPF prog-id=225 op=LOAD Jan 16 17:59:05.075000 audit[4533]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.080528 kernel: audit: type=1334 audit(1768586345.075:675): prog-id=225 op=LOAD Jan 16 17:59:05.080628 kernel: audit: type=1300 audit(1768586345.075:675): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0180 a2=98 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.075000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.084071 kernel: audit: type=1327 audit(1768586345.075:675): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.076000 audit: BPF prog-id=225 op=UNLOAD Jan 16 17:59:05.084838 kernel: audit: type=1334 audit(1768586345.076:676): prog-id=225 op=UNLOAD Jan 16 17:59:05.076000 audit[4533]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.087302 kernel: audit: type=1300 audit(1768586345.076:676): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.093584 kernel: audit: type=1327 audit(1768586345.076:676): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.076000 audit: BPF prog-id=226 op=LOAD Jan 16 17:59:05.076000 audit[4533]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a03e8 a2=98 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.076000 audit: BPF prog-id=227 op=LOAD Jan 16 17:59:05.076000 audit[4533]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a0168 a2=98 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.076000 audit: BPF prog-id=227 op=UNLOAD Jan 16 17:59:05.076000 audit[4533]: SYSCALL arch=c00000b7 
syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.076000 audit: BPF prog-id=226 op=UNLOAD Jan 16 17:59:05.076000 audit[4533]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.076000 audit: BPF prog-id=228 op=LOAD Jan 16 17:59:05.076000 audit[4533]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a0648 a2=98 a3=0 items=0 ppid=4521 pid=4533 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.076000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3066663563626462353138666533646330663861383731633932323438 Jan 16 17:59:05.097000 audit[4560]: NETFILTER_CFG table=filter:130 family=2 entries=45 op=nft_register_chain pid=4560 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:05.097000 audit[4560]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=24248 a0=3 a1=ffffe681ad40 a2=0 a3=ffffa4b29fa8 items=0 ppid=4060 pid=4560 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.097000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:05.107829 containerd[1547]: time="2026-01-16T17:59:05.107775503Z" level=info msg="connecting to shim 3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60" address="unix:///run/containerd/s/54f5757389b26ed37b5edf2dacea60bca2c7f971ed1958f09d1b32408f6176e6" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:05.124000 audit[4583]: NETFILTER_CFG table=filter:131 family=2 entries=17 op=nft_register_rule pid=4583 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:05.124000 audit[4583]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffff8dfc6f0 a2=0 a3=1 items=0 ppid=2976 pid=4583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.124000 audit: PROCTITLE 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:05.129000 audit[4583]: NETFILTER_CFG table=nat:132 family=2 entries=35 op=nft_register_chain pid=4583 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:05.129000 audit[4583]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffff8dfc6f0 a2=0 a3=1 items=0 ppid=2976 pid=4583 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.129000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:05.149167 containerd[1547]: time="2026-01-16T17:59:05.149129711Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-k622k,Uid:e2b5c1b0-147f-4d63-9d4a-0d66468ad158,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"0ff5cbdb518fe3dc0f8a871c92248d9de09d8881741da3b358ed84bb05c7069f\"" Jan 16 17:59:05.152405 containerd[1547]: time="2026-01-16T17:59:05.152361342Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 17:59:05.154064 systemd[1]: Started cri-containerd-3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60.scope - libcontainer container 3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60. Jan 16 17:59:05.177000 audit: BPF prog-id=229 op=LOAD Jan 16 17:59:05.178000 audit: BPF prog-id=230 op=LOAD Jan 16 17:59:05.178000 audit[4582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.178000 audit: BPF prog-id=230 op=UNLOAD Jan 16 17:59:05.178000 audit[4582]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.178000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.178000 audit: BPF prog-id=231 op=LOAD Jan 16 17:59:05.178000 audit[4582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.178000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.179000 audit: BPF prog-id=232 op=LOAD Jan 16 17:59:05.179000 audit[4582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.179000 audit: BPF prog-id=232 op=UNLOAD Jan 16 17:59:05.179000 audit[4582]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.179000 audit: BPF prog-id=231 op=UNLOAD Jan 16 17:59:05.179000 audit[4582]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.179000 audit: BPF prog-id=233 op=LOAD Jan 16 17:59:05.179000 audit[4582]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4570 pid=4582 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:05.179000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3363646330653666343830633564336263346531313034393137316334 Jan 16 17:59:05.208928 containerd[1547]: time="2026-01-16T17:59:05.208887897Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-apiserver-767f75bcf4-xk99t,Uid:d584925c-30a0-4760-b407-e5a8a40d9f3d,Namespace:calico-apiserver,Attempt:0,} returns sandbox id \"3cdc0e6f480c5d3bc4e11049171c43b62fc67830314c6fbdbecb9df5ecbdfd60\"" Jan 16 17:59:05.306833 systemd-networkd[1470]: cali7d0eafef29a: Gained IPv6LL Jan 16 17:59:05.474130 containerd[1547]: time="2026-01-16T17:59:05.474010259Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io 
Jan 16 17:59:05.475545 containerd[1547]: time="2026-01-16T17:59:05.475423285Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 17:59:05.475694 containerd[1547]: time="2026-01-16T17:59:05.475599853Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:05.476015 kubelet[2825]: E0116 17:59:05.475930 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:05.476015 kubelet[2825]: E0116 17:59:05.475999 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:05.476691 kubelet[2825]: E0116 17:59:05.476329 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-k622k_calico-apiserver(e2b5c1b0-147f-4d63-9d4a-0d66468ad158): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:05.476691 kubelet[2825]: E0116 17:59:05.476396 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:59:05.477490 containerd[1547]: time="2026-01-16T17:59:05.477455259Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 17:59:05.748919 containerd[1547]: time="2026-01-16T17:59:05.748785910Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xpvnz,Uid:42aeda7d-4fe5-4a6e-a477-419675cf7435,Namespace:kube-system,Attempt:0,}" Jan 16 17:59:05.823163 containerd[1547]: time="2026-01-16T17:59:05.822977129Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:05.824370 containerd[1547]: time="2026-01-16T17:59:05.824287990Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 17:59:05.824951 kubelet[2825]: E0116 17:59:05.824832 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:05.825414 kubelet[2825]: E0116 17:59:05.825291 
2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:05.825575 kubelet[2825]: E0116 17:59:05.825398 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-xk99t_calico-apiserver(d584925c-30a0-4760-b407-e5a8a40d9f3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:05.825740 containerd[1547]: time="2026-01-16T17:59:05.824321432Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:05.825791 kubelet[2825]: E0116 17:59:05.825535 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:05.915640 systemd-networkd[1470]: cali472f9b2c0c8: Link UP Jan 16 17:59:05.920292 systemd-networkd[1470]: cali472f9b2c0c8: Gained carrier Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.801 [INFO][4616] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0 coredns-66bc5c9577- kube-system 42aeda7d-4fe5-4a6e-a477-419675cf7435 844 0 2026-01-16 17:58:14 +0000 UTC map[k8s-app:kube-dns pod-template-hash:66bc5c9577 projectcalico.org/namespace:kube-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:coredns] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 coredns-66bc5c9577-xpvnz eth0 coredns [] [] [kns.kube-system ksa.kube-system.coredns] cali472f9b2c0c8 [{dns UDP 53 0 } {dns-tcp TCP 53 0 } {metrics TCP 9153 0 } {liveness-probe TCP 8080 0 } {readiness-probe TCP 8181 0 }] [] }} ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.801 [INFO][4616] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.834 [INFO][4629] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" HandleID="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Workload="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.835 [INFO][4629] ipam/ipam_plugin.go 275: Auto assigning IP 
ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" HandleID="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Workload="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b0d0), Attrs:map[string]string{"namespace":"kube-system", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"coredns-66bc5c9577-xpvnz", "timestamp":"2026-01-16 17:59:05.834971048 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.835 [INFO][4629] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.835 [INFO][4629] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.835 [INFO][4629] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.850 [INFO][4629] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.862 [INFO][4629] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.867 [INFO][4629] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.869 [INFO][4629] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.876 [INFO][4629] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.876 [INFO][4629] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.880 [INFO][4629] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.889 [INFO][4629] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.903 [INFO][4629] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.6/26] block=192.168.125.0/26 handle="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.903 [INFO][4629] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.6/26] handle="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.903 [INFO][4629] 
ipam/ipam_plugin.go 398: Released host-wide IPAM lock. Jan 16 17:59:05.947198 containerd[1547]: 2026-01-16 17:59:05.903 [INFO][4629] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.6/26] IPv6=[] ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" HandleID="k8s-pod-network.dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Workload="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.947962 containerd[1547]: 2026-01-16 17:59:05.907 [INFO][4616] cni-plugin/k8s.go 418: Populated endpoint ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"42aeda7d-4fe5-4a6e-a477-419675cf7435", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"coredns-66bc5c9577-xpvnz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.125.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali472f9b2c0c8", MAC:"", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:05.947962 containerd[1547]: 2026-01-16 17:59:05.908 [INFO][4616] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.6/32] ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.947962 containerd[1547]: 2026-01-16 17:59:05.910 [INFO][4616] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali472f9b2c0c8 ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" 
Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.947962 containerd[1547]: 2026-01-16 17:59:05.923 [INFO][4616] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.947962 containerd[1547]: 2026-01-16 17:59:05.924 [INFO][4616] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0", GenerateName:"coredns-66bc5c9577-", Namespace:"kube-system", SelfLink:"", UID:"42aeda7d-4fe5-4a6e-a477-419675cf7435", ResourceVersion:"844", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 14, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"k8s-app":"kube-dns", "pod-template-hash":"66bc5c9577", "projectcalico.org/namespace":"kube-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"coredns"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e", Pod:"coredns-66bc5c9577-xpvnz", Endpoint:"eth0", ServiceAccountName:"coredns", IPNetworks:[]string{"192.168.125.6/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.kube-system", "ksa.kube-system.coredns"}, InterfaceName:"cali472f9b2c0c8", MAC:"c6:86:3b:a7:b2:2c", Ports:[]v3.WorkloadEndpointPort{v3.WorkloadEndpointPort{Name:"dns", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"UDP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"dns-tcp", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x35, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"metrics", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x23c1, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"liveness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1f90, HostPort:0x0, HostIP:""}, v3.WorkloadEndpointPort{Name:"readiness-probe", Protocol:numorstring.Protocol{Type:1, NumVal:0x0, StrVal:"TCP"}, Port:0x1ff5, HostPort:0x0, HostIP:""}}, AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:05.948129 containerd[1547]: 2026-01-16 17:59:05.940 [INFO][4616] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" Namespace="kube-system" Pod="coredns-66bc5c9577-xpvnz" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-coredns--66bc5c9577--xpvnz-eth0" Jan 16 17:59:05.977785 containerd[1547]: time="2026-01-16T17:59:05.977645341Z" level=info msg="connecting to shim 
dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e" address="unix:///run/containerd/s/27263aec7e6e9357ec0831cd4bbcf28bd4620db07d868e1d9c6fb085c584063b" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:06.000000 audit[4658]: NETFILTER_CFG table=filter:133 family=2 entries=44 op=nft_register_chain pid=4658 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:06.000000 audit[4658]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=21516 a0=3 a1=ffffca55bb30 a2=0 a3=ffff8a15ffa8 items=0 ppid=4060 pid=4658 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.000000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:06.027404 kubelet[2825]: E0116 17:59:06.027108 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:06.035412 kubelet[2825]: E0116 17:59:06.034133 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:59:06.036927 kubelet[2825]: E0116 17:59:06.036810 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:59:06.044844 systemd[1]: Started cri-containerd-dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e.scope - libcontainer container dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e. 
Jan 16 17:59:06.077000 audit: BPF prog-id=234 op=LOAD Jan 16 17:59:06.078000 audit: BPF prog-id=235 op=LOAD Jan 16 17:59:06.078000 audit[4666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8180 a2=98 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.078000 audit: BPF prog-id=235 op=UNLOAD Jan 16 17:59:06.078000 audit[4666]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.078000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.080000 audit: BPF prog-id=236 op=LOAD Jan 16 17:59:06.080000 audit[4666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a83e8 a2=98 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.080000 audit: BPF prog-id=237 op=LOAD Jan 16 17:59:06.080000 audit[4666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=40001a8168 a2=98 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.080000 audit: BPF prog-id=237 op=UNLOAD Jan 16 17:59:06.080000 audit[4666]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.080000 audit: BPF prog-id=236 op=UNLOAD Jan 16 17:59:06.080000 audit[4666]: 
SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.080000 audit: BPF prog-id=238 op=LOAD Jan 16 17:59:06.080000 audit[4666]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001a8648 a2=98 a3=0 items=0 ppid=4653 pid=4666 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.080000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6463383634326330353763353661366234363062373062373838333032 Jan 16 17:59:06.130146 containerd[1547]: time="2026-01-16T17:59:06.129484615Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:coredns-66bc5c9577-xpvnz,Uid:42aeda7d-4fe5-4a6e-a477-419675cf7435,Namespace:kube-system,Attempt:0,} returns sandbox id \"dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e\"" Jan 16 17:59:06.138071 containerd[1547]: time="2026-01-16T17:59:06.138019290Z" level=info msg="CreateContainer within sandbox \"dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e\" for container &ContainerMetadata{Name:coredns,Attempt:0,}" Jan 16 17:59:06.152000 audit[4693]: NETFILTER_CFG table=filter:134 family=2 entries=14 op=nft_register_rule pid=4693 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:06.152000 audit[4693]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffdcd0c490 a2=0 a3=1 items=0 ppid=2976 pid=4693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.152000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:06.156131 containerd[1547]: time="2026-01-16T17:59:06.155978881Z" level=info msg="Container b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954: CDI devices from CRI Config.CDIDevices: []" Jan 16 17:59:06.158641 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount3476214884.mount: Deactivated successfully. 
Jan 16 17:59:06.157000 audit[4693]: NETFILTER_CFG table=nat:135 family=2 entries=20 op=nft_register_rule pid=4693 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:06.157000 audit[4693]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffdcd0c490 a2=0 a3=1 items=0 ppid=2976 pid=4693 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.157000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:06.167529 containerd[1547]: time="2026-01-16T17:59:06.167464732Z" level=info msg="CreateContainer within sandbox \"dc8642c057c56a6b460b70b7883028171fef4639a548ccbcb1c066234e51427e\" for &ContainerMetadata{Name:coredns,Attempt:0,} returns container id \"b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954\"" Jan 16 17:59:06.168422 containerd[1547]: time="2026-01-16T17:59:06.168371534Z" level=info msg="StartContainer for \"b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954\"" Jan 16 17:59:06.169751 containerd[1547]: time="2026-01-16T17:59:06.169716676Z" level=info msg="connecting to shim b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954" address="unix:///run/containerd/s/27263aec7e6e9357ec0831cd4bbcf28bd4620db07d868e1d9c6fb085c584063b" protocol=ttrpc version=3 Jan 16 17:59:06.192063 systemd[1]: Started cri-containerd-b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954.scope - libcontainer container b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954. Jan 16 17:59:06.206000 audit: BPF prog-id=239 op=LOAD Jan 16 17:59:06.207000 audit: BPF prog-id=240 op=LOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130180 a2=98 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.207000 audit: BPF prog-id=240 op=UNLOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.207000 audit: BPF prog-id=241 op=LOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001303e8 a2=98 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.207000 audit: BPF prog-id=242 op=LOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000130168 a2=98 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.207000 audit: BPF prog-id=242 op=UNLOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.207000 audit: BPF prog-id=241 op=UNLOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.207000 audit: BPF prog-id=243 op=LOAD Jan 16 17:59:06.207000 audit[4694]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000130648 a2=98 a3=0 items=0 ppid=4653 pid=4694 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:06.207000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6239356330313734336233343866343465336331653035663430336161 Jan 16 17:59:06.230856 containerd[1547]: time="2026-01-16T17:59:06.230805099Z" level=info msg="StartContainer for \"b95c01743b348f44e3c1e05f403aa6ee2a6f27c6425ce44c1336cc0313abb954\" returns successfully" Jan 16 17:59:06.586806 systemd-networkd[1470]: cali933c0f7c4ba: Gained IPv6LL Jan 16 17:59:06.750939 containerd[1547]: time="2026-01-16T17:59:06.749955037Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-pqdht,Uid:2eb1812e-29cb-4b14-8060-fe2f9a701433,Namespace:calico-system,Attempt:0,}" Jan 16 17:59:06.753216 containerd[1547]: 
time="2026-01-16T17:59:06.753173398Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-94bfc95c-f6gcl,Uid:c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a,Namespace:calico-system,Attempt:0,}" Jan 16 17:59:06.779825 systemd-networkd[1470]: cali527513e87aa: Gained IPv6LL Jan 16 17:59:06.933751 systemd-networkd[1470]: cali50a16c933b7: Link UP Jan 16 17:59:06.936190 systemd-networkd[1470]: cali50a16c933b7: Gained carrier Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.807 [INFO][4727] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0 calico-kube-controllers-94bfc95c- calico-system c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a 841 0 2026-01-16 17:58:40 +0000 UTC map[app.kubernetes.io/name:calico-kube-controllers k8s-app:calico-kube-controllers pod-template-hash:94bfc95c projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:calico-kube-controllers] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 calico-kube-controllers-94bfc95c-f6gcl eth0 calico-kube-controllers [] [] [kns.calico-system ksa.calico-system.calico-kube-controllers] cali50a16c933b7 [] [] }} ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.808 [INFO][4727] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.850 [INFO][4752] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" HandleID="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.850 [INFO][4752] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" HandleID="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400024b680), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"calico-kube-controllers-94bfc95c-f6gcl", "timestamp":"2026-01-16 17:59:06.850164503 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.850 [INFO][4752] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.850 [INFO][4752] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. 
Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.850 [INFO][4752] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.873 [INFO][4752] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.882 [INFO][4752] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.889 [INFO][4752] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.892 [INFO][4752] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.897 [INFO][4752] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.897 [INFO][4752] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.899 [INFO][4752] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.906 [INFO][4752] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.922 [INFO][4752] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.7/26] block=192.168.125.0/26 handle="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.922 [INFO][4752] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.7/26] handle="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.922 [INFO][4752] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
Jan 16 17:59:06.973938 containerd[1547]: 2026-01-16 17:59:06.922 [INFO][4752] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.7/26] IPv6=[] ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" HandleID="k8s-pod-network.09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Workload="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 17:59:06.975739 containerd[1547]: 2026-01-16 17:59:06.928 [INFO][4727] cni-plugin/k8s.go 418: Populated endpoint ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0", GenerateName:"calico-kube-controllers-94bfc95c-", Namespace:"calico-system", SelfLink:"", UID:"c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"94bfc95c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"calico-kube-controllers-94bfc95c-f6gcl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.125.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali50a16c933b7", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:06.975739 containerd[1547]: 2026-01-16 17:59:06.928 [INFO][4727] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.7/32] ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 17:59:06.975739 containerd[1547]: 2026-01-16 17:59:06.928 [INFO][4727] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to cali50a16c933b7 ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 17:59:06.975739 containerd[1547]: 2026-01-16 17:59:06.935 [INFO][4727] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 
17:59:06.975739 containerd[1547]: 2026-01-16 17:59:06.936 [INFO][4727] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0", GenerateName:"calico-kube-controllers-94bfc95c-", Namespace:"calico-system", SelfLink:"", UID:"c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a", ResourceVersion:"841", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 40, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"calico-kube-controllers", "k8s-app":"calico-kube-controllers", "pod-template-hash":"94bfc95c", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"calico-kube-controllers"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b", Pod:"calico-kube-controllers-94bfc95c-f6gcl", Endpoint:"eth0", ServiceAccountName:"calico-kube-controllers", IPNetworks:[]string{"192.168.125.7/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.calico-kube-controllers"}, InterfaceName:"cali50a16c933b7", MAC:"92:55:3a:e2:42:36", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:06.975739 containerd[1547]: 2026-01-16 17:59:06.965 [INFO][4727] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" Namespace="calico-system" Pod="calico-kube-controllers-94bfc95c-f6gcl" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-calico--kube--controllers--94bfc95c--f6gcl-eth0" Jan 16 17:59:07.020831 containerd[1547]: time="2026-01-16T17:59:07.020770111Z" level=info msg="connecting to shim 09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b" address="unix:///run/containerd/s/89b79c69d5d1562fe8293cee70167940f181710d20a137c5b22c62d5efa55594" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:07.025000 audit[4784]: NETFILTER_CFG table=filter:136 family=2 entries=40 op=nft_register_chain pid=4784 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:07.025000 audit[4784]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=20784 a0=3 a1=ffffd42bd550 a2=0 a3=ffffadcedfa8 items=0 ppid=4060 pid=4784 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.025000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:07.048410 kubelet[2825]: E0116 17:59:07.048105 2825 
pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:59:07.048855 kubelet[2825]: E0116 17:59:07.048566 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:07.085122 systemd-networkd[1470]: calia9058c16c76: Link UP Jan 16 17:59:07.086431 systemd-networkd[1470]: calia9058c16c76: Gained carrier Jan 16 17:59:07.114932 systemd[1]: Started cri-containerd-09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b.scope - libcontainer container 09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b. Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.824 [INFO][4734] cni-plugin/plugin.go 340: Calico CNI found existing endpoint: &{{WorkloadEndpoint projectcalico.org/v3} {ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0 goldmane-7c778bb748- calico-system 2eb1812e-29cb-4b14-8060-fe2f9a701433 840 0 2026-01-16 17:58:37 +0000 UTC map[app.kubernetes.io/name:goldmane k8s-app:goldmane pod-template-hash:7c778bb748 projectcalico.org/namespace:calico-system projectcalico.org/orchestrator:k8s projectcalico.org/serviceaccount:goldmane] map[] [] [] []} {k8s ci-4580-0-0-p-03fd9ab712 goldmane-7c778bb748-pqdht eth0 goldmane [] [] [kns.calico-system ksa.calico-system.goldmane] calia9058c16c76 [] [] }} ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.825 [INFO][4734] cni-plugin/k8s.go 74: Extracted identifiers for CmdAddK8s ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.881 [INFO][4757] ipam/ipam_plugin.go 227: Calico CNI IPAM request count IPv4=1 IPv6=0 ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" HandleID="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Workload="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.881 [INFO][4757] ipam/ipam_plugin.go 275: Auto assigning IP ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" HandleID="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Workload="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" 
assignArgs=ipam.AutoAssignArgs{Num4:1, Num6:0, HandleID:(*string)(0x400004d600), Attrs:map[string]string{"namespace":"calico-system", "node":"ci-4580-0-0-p-03fd9ab712", "pod":"goldmane-7c778bb748-pqdht", "timestamp":"2026-01-16 17:59:06.881352282 +0000 UTC"}, Hostname:"ci-4580-0-0-p-03fd9ab712", IPv4Pools:[]net.IPNet{}, IPv6Pools:[]net.IPNet{}, MaxBlocksPerHost:0, HostReservedAttrIPv4s:(*ipam.HostReservedAttr)(nil), HostReservedAttrIPv6s:(*ipam.HostReservedAttr)(nil), IntendedUse:"Workload"} Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.881 [INFO][4757] ipam/ipam_plugin.go 377: About to acquire host-wide IPAM lock. Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.922 [INFO][4757] ipam/ipam_plugin.go 392: Acquired host-wide IPAM lock. Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.922 [INFO][4757] ipam/ipam.go 110: Auto-assign 1 ipv4, 0 ipv6 addrs for host 'ci-4580-0-0-p-03fd9ab712' Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.978 [INFO][4757] ipam/ipam.go 691: Looking up existing affinities for host handle="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.988 [INFO][4757] ipam/ipam.go 394: Looking up existing affinities for host host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:06.999 [INFO][4757] ipam/ipam.go 511: Trying affinity for 192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.004 [INFO][4757] ipam/ipam.go 158: Attempting to load block cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.015 [INFO][4757] ipam/ipam.go 235: Affinity is confirmed and block has been loaded cidr=192.168.125.0/26 host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.016 [INFO][4757] ipam/ipam.go 1219: Attempting to assign 1 addresses from block block=192.168.125.0/26 handle="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.018 [INFO][4757] ipam/ipam.go 1780: Creating new handle: k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1 Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.035 [INFO][4757] ipam/ipam.go 1246: Writing block in order to claim IPs block=192.168.125.0/26 handle="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.070 [INFO][4757] ipam/ipam.go 1262: Successfully claimed IPs: [192.168.125.8/26] block=192.168.125.0/26 handle="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.070 [INFO][4757] ipam/ipam.go 878: Auto-assigned 1 out of 1 IPv4s: [192.168.125.8/26] handle="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" host="ci-4580-0-0-p-03fd9ab712" Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.070 [INFO][4757] ipam/ipam_plugin.go 398: Released host-wide IPAM lock. 
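Comparing the two traces shows why the addresses come out consecutive: the goldmane request logs "About to acquire host-wide IPAM lock" at 17:59:06.881 but only acquires it at 17:59:06.922, the moment the kube-controllers request releases it, so per-node assignments are strictly serialized and the two pods receive 192.168.125.7 and 192.168.125.8 in turn. A minimal, hypothetical sketch of that serialization pattern (not Calico's implementation) follows.

// ipamlock_sketch.go: hypothetical illustration of serializing per-host
// IPAM assignments behind a single lock, mirroring the "host-wide IPAM
// lock" messages above; concurrent requests get consecutive addresses.
package main

import (
	"fmt"
	"net/netip"
	"sync"
)

type hostAllocator struct {
	mu   sync.Mutex // the "host-wide IPAM lock"
	next netip.Addr // next free address in the affine block
}

func (a *hostAllocator) assign(pod string) netip.Addr {
	a.mu.Lock()         // "About to acquire host-wide IPAM lock."
	defer a.mu.Unlock() // "Released host-wide IPAM lock."
	ip := a.next
	a.next = a.next.Next()
	fmt.Printf("assigned %s to %s\n", ip, pod)
	return ip
}

func main() {
	alloc := &hostAllocator{next: netip.MustParseAddr("192.168.125.7")}
	var wg sync.WaitGroup
	for _, pod := range []string{
		"calico-kube-controllers-94bfc95c-f6gcl",
		"goldmane-7c778bb748-pqdht",
	} {
		wg.Add(1)
		go func(p string) { defer wg.Done(); alloc.assign(p) }(pod)
	}
	wg.Wait()
}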
Jan 16 17:59:07.120140 containerd[1547]: 2026-01-16 17:59:07.070 [INFO][4757] ipam/ipam_plugin.go 299: Calico CNI IPAM assigned addresses IPv4=[192.168.125.8/26] IPv6=[] ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" HandleID="k8s-pod-network.fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Workload="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.120730 containerd[1547]: 2026-01-16 17:59:07.077 [INFO][4734] cni-plugin/k8s.go 418: Populated endpoint ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"2eb1812e-29cb-4b14-8060-fe2f9a701433", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"", Pod:"goldmane-7c778bb748-pqdht", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.125.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia9058c16c76", MAC:"", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:07.120730 containerd[1547]: 2026-01-16 17:59:07.077 [INFO][4734] cni-plugin/k8s.go 419: Calico CNI using IPs: [192.168.125.8/32] ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.120730 containerd[1547]: 2026-01-16 17:59:07.077 [INFO][4734] cni-plugin/dataplane_linux.go 69: Setting the host side veth name to calia9058c16c76 ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.120730 containerd[1547]: 2026-01-16 17:59:07.086 [INFO][4734] cni-plugin/dataplane_linux.go 508: Disabling IPv4 forwarding ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.120730 containerd[1547]: 2026-01-16 17:59:07.087 [INFO][4734] cni-plugin/k8s.go 446: Added Mac, interface name, and active container ID to endpoint ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" 
Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" endpoint=&v3.WorkloadEndpoint{TypeMeta:v1.TypeMeta{Kind:"WorkloadEndpoint", APIVersion:"projectcalico.org/v3"}, ObjectMeta:v1.ObjectMeta{Name:"ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0", GenerateName:"goldmane-7c778bb748-", Namespace:"calico-system", SelfLink:"", UID:"2eb1812e-29cb-4b14-8060-fe2f9a701433", ResourceVersion:"840", Generation:0, CreationTimestamp:time.Date(2026, time.January, 16, 17, 58, 37, 0, time.Local), DeletionTimestamp:, DeletionGracePeriodSeconds:(*int64)(nil), Labels:map[string]string{"app.kubernetes.io/name":"goldmane", "k8s-app":"goldmane", "pod-template-hash":"7c778bb748", "projectcalico.org/namespace":"calico-system", "projectcalico.org/orchestrator":"k8s", "projectcalico.org/serviceaccount":"goldmane"}, Annotations:map[string]string(nil), OwnerReferences:[]v1.OwnerReference(nil), Finalizers:[]string(nil), ManagedFields:[]v1.ManagedFieldsEntry(nil)}, Spec:v3.WorkloadEndpointSpec{Orchestrator:"k8s", Workload:"", Node:"ci-4580-0-0-p-03fd9ab712", ContainerID:"fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1", Pod:"goldmane-7c778bb748-pqdht", Endpoint:"eth0", ServiceAccountName:"goldmane", IPNetworks:[]string{"192.168.125.8/32"}, IPNATs:[]v3.IPNAT(nil), IPv4Gateway:"", IPv6Gateway:"", Profiles:[]string{"kns.calico-system", "ksa.calico-system.goldmane"}, InterfaceName:"calia9058c16c76", MAC:"52:7e:4f:cf:0c:43", Ports:[]v3.WorkloadEndpointPort(nil), AllowSpoofedSourcePrefixes:[]string(nil), QoSControls:(*v3.QoSControls)(nil)}} Jan 16 17:59:07.120730 containerd[1547]: 2026-01-16 17:59:07.111 [INFO][4734] cni-plugin/k8s.go 532: Wrote updated endpoint to datastore ContainerID="fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" Namespace="calico-system" Pod="goldmane-7c778bb748-pqdht" WorkloadEndpoint="ci--4580--0--0--p--03fd9ab712-k8s-goldmane--7c778bb748--pqdht-eth0" Jan 16 17:59:07.123208 kubelet[2825]: I0116 17:59:07.122309 2825 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/coredns-66bc5c9577-xpvnz" podStartSLOduration=53.122293841 podStartE2EDuration="53.122293841s" podCreationTimestamp="2026-01-16 17:58:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-16 17:59:07.121909414 +0000 UTC m=+58.509825229" watchObservedRunningTime="2026-01-16 17:59:07.122293841 +0000 UTC m=+58.510209656" Jan 16 17:59:07.169985 containerd[1547]: time="2026-01-16T17:59:07.169834559Z" level=info msg="connecting to shim fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1" address="unix:///run/containerd/s/a4e18e517cd7409c157269be83b004c33a516b0026fe449025145806ad71dfbe" namespace=k8s.io protocol=ttrpc version=3 Jan 16 17:59:07.205000 audit: BPF prog-id=244 op=LOAD Jan 16 17:59:07.207000 audit: BPF prog-id=245 op=LOAD Jan 16 17:59:07.207000 audit[4797]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220180 a2=98 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.207000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.208000 audit: BPF prog-id=245 op=UNLOAD Jan 16 17:59:07.208000 audit[4797]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.208000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.209000 audit: BPF prog-id=246 op=LOAD Jan 16 17:59:07.209000 audit[4797]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40002203e8 a2=98 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.209000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.210000 audit: BPF prog-id=247 op=LOAD Jan 16 17:59:07.210000 audit[4797]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000220168 a2=98 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.210000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.212000 audit: BPF prog-id=247 op=UNLOAD Jan 16 17:59:07.212000 audit[4797]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.212000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.212000 audit: BPF prog-id=246 op=UNLOAD Jan 16 17:59:07.212000 audit[4797]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.212000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.212000 audit: BPF prog-id=248 op=LOAD Jan 16 17:59:07.212000 audit[4797]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000220648 a2=98 a3=0 items=0 ppid=4786 pid=4797 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.212000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3039636665313563633538316431663763336536346565396663623362 Jan 16 17:59:07.242234 systemd[1]: Started cri-containerd-fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1.scope - libcontainer container fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1. Jan 16 17:59:07.252000 audit[4861]: NETFILTER_CFG table=filter:137 family=2 entries=14 op=nft_register_rule pid=4861 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:07.252000 audit[4861]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=fffffbc29490 a2=0 a3=1 items=0 ppid=2976 pid=4861 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.252000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:07.261000 audit[4861]: NETFILTER_CFG table=nat:138 family=2 entries=44 op=nft_register_rule pid=4861 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:07.261000 audit[4861]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14196 a0=3 a1=fffffbc29490 a2=0 a3=1 items=0 ppid=2976 pid=4861 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.261000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:07.290724 systemd-networkd[1470]: cali472f9b2c0c8: Gained IPv6LL Jan 16 17:59:07.308468 containerd[1547]: time="2026-01-16T17:59:07.308079390Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:calico-kube-controllers-94bfc95c-f6gcl,Uid:c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a,Namespace:calico-system,Attempt:0,} returns sandbox id \"09cfe15cc581d1f7c3e64ee9fcb3be220b3346333647f6a6c7e5b8f2ea7e372b\"" Jan 16 17:59:07.312788 containerd[1547]: time="2026-01-16T17:59:07.311082684Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 17:59:07.324000 audit: BPF prog-id=249 op=LOAD Jan 16 17:59:07.326000 audit: BPF prog-id=250 op=LOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106180 a2=98 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.326000 audit: BPF prog-id=250 op=UNLOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.326000 audit: BPF prog-id=251 op=LOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001063e8 a2=98 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.326000 audit: BPF prog-id=252 op=LOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000106168 a2=98 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.326000 audit: BPF prog-id=252 op=UNLOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.326000 audit: BPF prog-id=251 op=UNLOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE 
proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.326000 audit: BPF prog-id=253 op=LOAD Jan 16 17:59:07.326000 audit[4845]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000106648 a2=98 a3=0 items=0 ppid=4832 pid=4845 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.326000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F6661653064356164376531383163623866373339353363346461306333 Jan 16 17:59:07.354000 audit[4878]: NETFILTER_CFG table=filter:139 family=2 entries=60 op=nft_register_chain pid=4878 subj=system_u:system_r:kernel_t:s0 comm="iptables-nft-re" Jan 16 17:59:07.354000 audit[4878]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=29916 a0=3 a1=ffffc5cb0150 a2=0 a3=ffff8329bfa8 items=0 ppid=4060 pid=4878 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-nft-re" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:07.354000 audit: PROCTITLE proctitle=69707461626C65732D6E66742D726573746F7265002D2D6E6F666C757368002D2D766572626F7365002D2D77616974003130002D2D776169742D696E74657276616C003530303030 Jan 16 17:59:07.379791 containerd[1547]: time="2026-01-16T17:59:07.379721456Z" level=info msg="RunPodSandbox for &PodSandboxMetadata{Name:goldmane-7c778bb748-pqdht,Uid:2eb1812e-29cb-4b14-8060-fe2f9a701433,Namespace:calico-system,Attempt:0,} returns sandbox id \"fae0d5ad7e181cb8f73953c4da0c3f4e51022cdb1f606755cfc01f46ae39ecc1\"" Jan 16 17:59:07.653920 containerd[1547]: time="2026-01-16T17:59:07.653237863Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:07.655370 containerd[1547]: time="2026-01-16T17:59:07.655178075Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 17:59:07.655370 containerd[1547]: time="2026-01-16T17:59:07.655265711Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:07.655766 kubelet[2825]: E0116 17:59:07.655706 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 17:59:07.655884 kubelet[2825]: E0116 17:59:07.655767 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 17:59:07.656356 kubelet[2825]: E0116 17:59:07.655962 
2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-94bfc95c-f6gcl_calico-system(c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:07.656572 containerd[1547]: time="2026-01-16T17:59:07.656461549Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 17:59:07.657308 kubelet[2825]: E0116 17:59:07.657215 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:08.015709 containerd[1547]: time="2026-01-16T17:59:08.015631599Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:08.017243 containerd[1547]: time="2026-01-16T17:59:08.017159868Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 17:59:08.017384 containerd[1547]: time="2026-01-16T17:59:08.017303383Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:08.017662 kubelet[2825]: E0116 17:59:08.017612 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 17:59:08.017838 kubelet[2825]: E0116 17:59:08.017810 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 17:59:08.018039 kubelet[2825]: E0116 17:59:08.018014 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-pqdht_calico-system(2eb1812e-29cb-4b14-8060-fe2f9a701433): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:08.018241 kubelet[2825]: E0116 17:59:08.018184 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:59:08.050195 kubelet[2825]: E0116 17:59:08.050137 2825 pod_workers.go:1324] "Error syncing pod, 
skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:59:08.054275 kubelet[2825]: E0116 17:59:08.053784 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:08.188335 systemd-networkd[1470]: cali50a16c933b7: Gained IPv6LL Jan 16 17:59:08.295000 audit[4889]: NETFILTER_CFG table=filter:140 family=2 entries=14 op=nft_register_rule pid=4889 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:08.295000 audit[4889]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffdf6989d0 a2=0 a3=1 items=0 ppid=2976 pid=4889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:08.295000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:08.308000 audit[4889]: NETFILTER_CFG table=nat:141 family=2 entries=56 op=nft_register_chain pid=4889 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 17:59:08.308000 audit[4889]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=19860 a0=3 a1=ffffdf6989d0 a2=0 a3=1 items=0 ppid=2976 pid=4889 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 17:59:08.308000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 17:59:08.954767 systemd-networkd[1470]: calia9058c16c76: Gained IPv6LL Jan 16 17:59:09.056102 kubelet[2825]: E0116 17:59:09.055956 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:59:09.059280 kubelet[2825]: E0116 17:59:09.058986 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to 
resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:17.747811 containerd[1547]: time="2026-01-16T17:59:17.747740542Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 17:59:18.088478 containerd[1547]: time="2026-01-16T17:59:18.088187762Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:18.090575 containerd[1547]: time="2026-01-16T17:59:18.090495887Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 17:59:18.090863 containerd[1547]: time="2026-01-16T17:59:18.090812682Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:18.095580 kubelet[2825]: E0116 17:59:18.095460 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 17:59:18.095580 kubelet[2825]: E0116 17:59:18.095529 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 17:59:18.096090 kubelet[2825]: E0116 17:59:18.095627 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:18.098767 containerd[1547]: time="2026-01-16T17:59:18.098732680Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 17:59:18.492103 containerd[1547]: time="2026-01-16T17:59:18.491821564Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:18.493712 containerd[1547]: time="2026-01-16T17:59:18.493644456Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 17:59:18.494181 containerd[1547]: time="2026-01-16T17:59:18.493739535Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:18.494477 kubelet[2825]: E0116 17:59:18.494415 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 17:59:18.494720 kubelet[2825]: E0116 17:59:18.494677 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc 
error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 17:59:18.495058 kubelet[2825]: E0116 17:59:18.494992 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:18.496461 kubelet[2825]: E0116 17:59:18.496415 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:19.747526 containerd[1547]: time="2026-01-16T17:59:19.747334395Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 17:59:20.090282 containerd[1547]: time="2026-01-16T17:59:20.089984894Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:20.091869 containerd[1547]: time="2026-01-16T17:59:20.091771151Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 17:59:20.092005 containerd[1547]: time="2026-01-16T17:59:20.091890030Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:20.092276 kubelet[2825]: E0116 17:59:20.092191 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 17:59:20.092276 kubelet[2825]: E0116 17:59:20.092262 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 17:59:20.092771 kubelet[2825]: E0116 17:59:20.092478 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-94bfc95c-f6gcl_calico-system(c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: 
ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:20.092771 kubelet[2825]: E0116 17:59:20.092518 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:20.092954 containerd[1547]: time="2026-01-16T17:59:20.092758299Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 17:59:20.434475 containerd[1547]: time="2026-01-16T17:59:20.434293181Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:20.435821 containerd[1547]: time="2026-01-16T17:59:20.435754443Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 17:59:20.435969 containerd[1547]: time="2026-01-16T17:59:20.435911641Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:20.436172 kubelet[2825]: E0116 17:59:20.436122 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:20.436273 kubelet[2825]: E0116 17:59:20.436182 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:20.436613 kubelet[2825]: E0116 17:59:20.436360 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-k622k_calico-apiserver(e2b5c1b0-147f-4d63-9d4a-0d66468ad158): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:20.437220 kubelet[2825]: E0116 17:59:20.436639 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:59:20.749310 containerd[1547]: time="2026-01-16T17:59:20.748483682Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 17:59:21.087892 containerd[1547]: time="2026-01-16T17:59:21.087667352Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:21.090015 containerd[1547]: time="2026-01-16T17:59:21.089950607Z" level=error msg="PullImage 
\"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 17:59:21.090824 containerd[1547]: time="2026-01-16T17:59:21.090053685Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:21.090928 kubelet[2825]: E0116 17:59:21.090189 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 17:59:21.090928 kubelet[2825]: E0116 17:59:21.090227 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 17:59:21.090928 kubelet[2825]: E0116 17:59:21.090297 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:21.092657 containerd[1547]: time="2026-01-16T17:59:21.092612177Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 17:59:21.445698 containerd[1547]: time="2026-01-16T17:59:21.445629612Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:21.447443 containerd[1547]: time="2026-01-16T17:59:21.447365033Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 17:59:21.447708 containerd[1547]: time="2026-01-16T17:59:21.447410392Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:21.447880 kubelet[2825]: E0116 17:59:21.447782 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 17:59:21.448453 kubelet[2825]: E0116 17:59:21.447893 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 17:59:21.448453 kubelet[2825]: E0116 17:59:21.447997 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and 
unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:21.448453 kubelet[2825]: E0116 17:59:21.448055 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:59:21.748681 containerd[1547]: time="2026-01-16T17:59:21.747642568Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 17:59:22.079746 containerd[1547]: time="2026-01-16T17:59:22.079588254Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:22.081330 containerd[1547]: time="2026-01-16T17:59:22.081169679Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 17:59:22.081330 containerd[1547]: time="2026-01-16T17:59:22.081249278Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:22.081724 kubelet[2825]: E0116 17:59:22.081678 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:22.081866 kubelet[2825]: E0116 17:59:22.081730 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:22.081935 kubelet[2825]: E0116 17:59:22.081813 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-xk99t_calico-apiserver(d584925c-30a0-4760-b407-e5a8a40d9f3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:22.081935 kubelet[2825]: E0116 17:59:22.081908 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:22.749472 containerd[1547]: 
time="2026-01-16T17:59:22.749360234Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 17:59:23.090888 containerd[1547]: time="2026-01-16T17:59:23.090615653Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:23.093271 containerd[1547]: time="2026-01-16T17:59:23.093154952Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 17:59:23.093271 containerd[1547]: time="2026-01-16T17:59:23.093198472Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:23.093445 kubelet[2825]: E0116 17:59:23.093401 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 17:59:23.094169 kubelet[2825]: E0116 17:59:23.093455 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 17:59:23.094169 kubelet[2825]: E0116 17:59:23.093574 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-pqdht_calico-system(2eb1812e-29cb-4b14-8060-fe2f9a701433): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:23.094169 kubelet[2825]: E0116 17:59:23.093616 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:59:32.750768 kubelet[2825]: E0116 17:59:32.750413 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:32.753658 kubelet[2825]: E0116 17:59:32.753579 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:32.755566 kubelet[2825]: E0116 17:59:32.755255 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:59:33.752585 kubelet[2825]: E0116 17:59:33.751229 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:33.754507 kubelet[2825]: E0116 17:59:33.754370 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:59:37.748541 kubelet[2825]: E0116 17:59:37.748173 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:59:41.013099 kernel: kauditd_printk_skb: 161 callbacks suppressed Jan 16 17:59:41.013214 kernel: audit: type=1130 audit(1768586381.008:734): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.12.189.56:22-80.94.95.115:31026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:59:41.008000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.12.189.56:22-80.94.95.115:31026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:41.010299 systemd[1]: Started sshd@8-49.12.189.56:22-80.94.95.115:31026.service - OpenSSH per-connection server daemon (80.94.95.115:31026). Jan 16 17:59:43.621976 sshd[4946]: Invalid user admin from 80.94.95.115 port 31026 Jan 16 17:59:43.733268 sshd[4946]: Connection closed by invalid user admin 80.94.95.115 port 31026 [preauth] Jan 16 17:59:43.732000 audit[4946]: USER_ERR pid=4946 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=80.94.95.115 addr=80.94.95.115 terminal=ssh res=failed' Jan 16 17:59:43.739868 systemd[1]: sshd@8-49.12.189.56:22-80.94.95.115:31026.service: Deactivated successfully. Jan 16 17:59:43.738000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.12.189.56:22-80.94.95.115:31026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 17:59:43.743343 kernel: audit: type=1109 audit(1768586383.732:735): pid=4946 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:bad_ident grantors=? acct="?" exe="/usr/lib64/misc/sshd-session" hostname=80.94.95.115 addr=80.94.95.115 terminal=ssh res=failed' Jan 16 17:59:43.743422 kernel: audit: type=1131 audit(1768586383.738:736): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@8-49.12.189.56:22-80.94.95.115:31026 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 17:59:43.748773 containerd[1547]: time="2026-01-16T17:59:43.748734795Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 17:59:44.106965 containerd[1547]: time="2026-01-16T17:59:44.106908000Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:44.108807 containerd[1547]: time="2026-01-16T17:59:44.108527619Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 17:59:44.108807 containerd[1547]: time="2026-01-16T17:59:44.108630100Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:44.108931 kubelet[2825]: E0116 17:59:44.108871 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 17:59:44.108931 kubelet[2825]: E0116 17:59:44.108913 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 17:59:44.109208 kubelet[2825]: E0116 17:59:44.109114 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:44.109741 containerd[1547]: time="2026-01-16T17:59:44.109698792Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 17:59:44.449765 containerd[1547]: time="2026-01-16T17:59:44.449569037Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:44.450692 containerd[1547]: time="2026-01-16T17:59:44.450658089Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 17:59:44.450934 containerd[1547]: time="2026-01-16T17:59:44.450873052Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:44.451230 kubelet[2825]: E0116 17:59:44.451159 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 17:59:44.451230 kubelet[2825]: E0116 17:59:44.451214 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 17:59:44.451531 kubelet[2825]: E0116 
17:59:44.451511 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:44.452756 containerd[1547]: time="2026-01-16T17:59:44.451867143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 17:59:44.791178 containerd[1547]: time="2026-01-16T17:59:44.791072900Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:44.793010 containerd[1547]: time="2026-01-16T17:59:44.792875161Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 17:59:44.793010 containerd[1547]: time="2026-01-16T17:59:44.792967882Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:44.793780 kubelet[2825]: E0116 17:59:44.793727 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 17:59:44.793780 kubelet[2825]: E0116 17:59:44.793781 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 17:59:44.794087 kubelet[2825]: E0116 17:59:44.793950 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:44.794087 kubelet[2825]: E0116 17:59:44.793997 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 17:59:44.794363 containerd[1547]: time="2026-01-16T17:59:44.794340738Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 17:59:45.137002 
containerd[1547]: time="2026-01-16T17:59:45.136743341Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:45.138574 containerd[1547]: time="2026-01-16T17:59:45.138390481Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 17:59:45.138574 containerd[1547]: time="2026-01-16T17:59:45.138504642Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:45.139074 kubelet[2825]: E0116 17:59:45.138905 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 17:59:45.139074 kubelet[2825]: E0116 17:59:45.138954 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 17:59:45.139995 kubelet[2825]: E0116 17:59:45.139420 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:45.140139 kubelet[2825]: E0116 17:59:45.140085 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:46.751671 containerd[1547]: time="2026-01-16T17:59:46.751352519Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 17:59:47.085328 containerd[1547]: time="2026-01-16T17:59:47.084936053Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:47.087009 containerd[1547]: time="2026-01-16T17:59:47.086871039Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 17:59:47.087284 kubelet[2825]: E0116 17:59:47.087114 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image 
\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 17:59:47.087284 kubelet[2825]: E0116 17:59:47.087171 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 17:59:47.087960 kubelet[2825]: E0116 17:59:47.087416 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-94bfc95c-f6gcl_calico-system(c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:47.087960 kubelet[2825]: E0116 17:59:47.087461 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:47.088434 containerd[1547]: time="2026-01-16T17:59:47.086946360Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:47.088434 containerd[1547]: time="2026-01-16T17:59:47.088361499Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 17:59:47.435411 containerd[1547]: time="2026-01-16T17:59:47.434672158Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:47.436530 containerd[1547]: time="2026-01-16T17:59:47.436476982Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 17:59:47.436752 containerd[1547]: time="2026-01-16T17:59:47.436585944Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:47.436860 kubelet[2825]: E0116 17:59:47.436811 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:47.436943 kubelet[2825]: E0116 17:59:47.436867 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:47.437224 kubelet[2825]: E0116 17:59:47.436955 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod 
calico-apiserver-767f75bcf4-xk99t_calico-apiserver(d584925c-30a0-4760-b407-e5a8a40d9f3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:47.437224 kubelet[2825]: E0116 17:59:47.436996 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:48.752000 containerd[1547]: time="2026-01-16T17:59:48.751924446Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 17:59:49.085923 containerd[1547]: time="2026-01-16T17:59:49.085522182Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:49.087291 containerd[1547]: time="2026-01-16T17:59:49.087229807Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 17:59:49.087794 containerd[1547]: time="2026-01-16T17:59:49.087256207Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:49.088089 kubelet[2825]: E0116 17:59:49.087884 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:49.088089 kubelet[2825]: E0116 17:59:49.087942 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 17:59:49.089618 kubelet[2825]: E0116 17:59:49.089043 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-k622k_calico-apiserver(e2b5c1b0-147f-4d63-9d4a-0d66468ad158): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:49.089835 kubelet[2825]: E0116 17:59:49.089794 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 17:59:51.750228 containerd[1547]: time="2026-01-16T17:59:51.749470926Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 17:59:52.085945 containerd[1547]: 
time="2026-01-16T17:59:52.085471255Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 17:59:52.087604 containerd[1547]: time="2026-01-16T17:59:52.087381366Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 17:59:52.087604 containerd[1547]: time="2026-01-16T17:59:52.087507888Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 17:59:52.087971 kubelet[2825]: E0116 17:59:52.087869 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 17:59:52.087971 kubelet[2825]: E0116 17:59:52.087944 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 17:59:52.088696 kubelet[2825]: E0116 17:59:52.088078 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-pqdht_calico-system(2eb1812e-29cb-4b14-8060-fe2f9a701433): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 17:59:52.088696 kubelet[2825]: E0116 17:59:52.088140 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 17:59:57.749276 kubelet[2825]: E0116 17:59:57.749177 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 17:59:58.753439 kubelet[2825]: E0116 17:59:58.753352 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 17:59:59.747135 kubelet[2825]: E0116 17:59:59.746988 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 17:59:59.748716 kubelet[2825]: E0116 17:59:59.748655 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:00:00.749731 kubelet[2825]: E0116 18:00:00.749260 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:00:06.747937 kubelet[2825]: E0116 18:00:06.747832 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:00:09.749025 kubelet[2825]: E0116 18:00:09.748412 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling 
image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:00:10.754563 kubelet[2825]: E0116 18:00:10.754123 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:00:11.749270 kubelet[2825]: E0116 18:00:11.748720 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:00:12.750216 kubelet[2825]: E0116 18:00:12.749956 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:00:15.747299 kubelet[2825]: E0116 18:00:15.747189 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:00:18.749694 kubelet[2825]: E0116 18:00:18.749224 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: 
ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:00:22.747453 kubelet[2825]: E0116 18:00:22.747353 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:00:23.747812 kubelet[2825]: E0116 18:00:23.747772 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:00:24.751214 containerd[1547]: time="2026-01-16T18:00:24.751159771Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\"" Jan 16 18:00:25.102978 containerd[1547]: time="2026-01-16T18:00:25.102370821Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:25.104846 containerd[1547]: time="2026-01-16T18:00:25.104631802Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" Jan 16 18:00:25.104846 containerd[1547]: time="2026-01-16T18:00:25.104674483Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:25.105258 kubelet[2825]: E0116 18:00:25.105208 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:00:25.105258 kubelet[2825]: E0116 18:00:25.105258 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4" Jan 16 18:00:25.106814 kubelet[2825]: E0116 18:00:25.105365 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:25.107497 kubelet[2825]: E0116 18:00:25.107403 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:00:26.748705 kubelet[2825]: E0116 18:00:26.748311 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:00:27.747905 containerd[1547]: time="2026-01-16T18:00:27.747253894Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\"" Jan 16 18:00:28.151978 containerd[1547]: time="2026-01-16T18:00:28.151180930Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:28.153460 containerd[1547]: time="2026-01-16T18:00:28.153319389Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/csi:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" Jan 16 18:00:28.153460 containerd[1547]: time="2026-01-16T18:00:28.153385710Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/csi:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:28.153685 kubelet[2825]: E0116 18:00:28.153595 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:00:28.153685 kubelet[2825]: E0116 18:00:28.153640 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" image="ghcr.io/flatcar/calico/csi:v3.30.4" Jan 16 18:00:28.154072 kubelet[2825]: E0116 18:00:28.153714 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-csi start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/csi:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:28.155638 containerd[1547]: time="2026-01-16T18:00:28.154912432Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\"" Jan 16 18:00:28.518325 containerd[1547]: time="2026-01-16T18:00:28.518254882Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:28.521573 containerd[1547]: 
time="2026-01-16T18:00:28.519985529Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" Jan 16 18:00:28.521573 containerd[1547]: time="2026-01-16T18:00:28.520084372Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:28.521920 kubelet[2825]: E0116 18:00:28.521869 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:00:28.522070 kubelet[2825]: E0116 18:00:28.522011 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" image="ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4" Jan 16 18:00:28.522175 kubelet[2825]: E0116 18:00:28.522158 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container csi-node-driver-registrar start failed in pod csi-node-driver-lqf7c_calico-system(8f14bccb-f353-467f-b549-674ae9114a0e): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:28.522295 kubelet[2825]: E0116 18:00:28.522267 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:00:29.747025 kubelet[2825]: E0116 18:00:29.746966 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:00:37.749522 containerd[1547]: time="2026-01-16T18:00:37.749356477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\"" Jan 16 18:00:38.100333 containerd[1547]: time="2026-01-16T18:00:38.099957403Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:38.101288 containerd[1547]: 
time="2026-01-16T18:00:38.101198279Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" Jan 16 18:00:38.101376 containerd[1547]: time="2026-01-16T18:00:38.101299082Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/kube-controllers:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:38.101689 kubelet[2825]: E0116 18:00:38.101645 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:00:38.102618 kubelet[2825]: E0116 18:00:38.101706 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" image="ghcr.io/flatcar/calico/kube-controllers:v3.30.4" Jan 16 18:00:38.102618 kubelet[2825]: E0116 18:00:38.102295 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-kube-controllers start failed in pod calico-kube-controllers-94bfc95c-f6gcl_calico-system(c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:38.102618 kubelet[2825]: E0116 18:00:38.102353 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:00:38.102734 containerd[1547]: time="2026-01-16T18:00:38.102516477Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:00:38.508615 containerd[1547]: time="2026-01-16T18:00:38.508496865Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:38.510743 containerd[1547]: time="2026-01-16T18:00:38.510677727Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:00:38.510987 containerd[1547]: time="2026-01-16T18:00:38.510710608Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:38.511178 kubelet[2825]: E0116 18:00:38.511140 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:38.511178 kubelet[2825]: 
E0116 18:00:38.511214 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:38.511480 kubelet[2825]: E0116 18:00:38.511398 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-xk99t_calico-apiserver(d584925c-30a0-4760-b407-e5a8a40d9f3d): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:38.511686 kubelet[2825]: E0116 18:00:38.511575 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:00:38.753515 containerd[1547]: time="2026-01-16T18:00:38.753452025Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\"" Jan 16 18:00:39.089585 containerd[1547]: time="2026-01-16T18:00:39.089272607Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:39.090883 containerd[1547]: time="2026-01-16T18:00:39.090768570Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/apiserver:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" Jan 16 18:00:39.090998 containerd[1547]: time="2026-01-16T18:00:39.090856053Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:39.091406 kubelet[2825]: E0116 18:00:39.091205 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:39.091406 kubelet[2825]: E0116 18:00:39.091252 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" image="ghcr.io/flatcar/calico/apiserver:v3.30.4" Jan 16 18:00:39.091406 kubelet[2825]: E0116 18:00:39.091337 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container calico-apiserver start failed in pod calico-apiserver-767f75bcf4-k622k_calico-apiserver(e2b5c1b0-147f-4d63-9d4a-0d66468ad158): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/apiserver:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:39.091406 kubelet[2825]: E0116 18:00:39.091369 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ErrImagePull: \"rpc error: code = 
NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:00:39.748515 containerd[1547]: time="2026-01-16T18:00:39.748204143Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\"" Jan 16 18:00:40.108397 containerd[1547]: time="2026-01-16T18:00:40.108178143Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:40.109642 containerd[1547]: time="2026-01-16T18:00:40.109523062Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" Jan 16 18:00:40.109926 containerd[1547]: time="2026-01-16T18:00:40.109576344Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker-backend:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:40.110023 kubelet[2825]: E0116 18:00:40.109971 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:00:40.110616 kubelet[2825]: E0116 18:00:40.110026 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker-backend:v3.30.4" Jan 16 18:00:40.110616 kubelet[2825]: E0116 18:00:40.110104 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker-backend start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:40.110616 kubelet[2825]: E0116 18:00:40.110143 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:00:41.749357 containerd[1547]: time="2026-01-16T18:00:41.748723917Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\"" Jan 16 18:00:41.751326 kubelet[2825]: E0116 18:00:41.751221 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" 
with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:00:41.903000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-49.12.189.56:22-68.220.241.50:36300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:41.903890 systemd[1]: Started sshd@9-49.12.189.56:22-68.220.241.50:36300.service - OpenSSH per-connection server daemon (68.220.241.50:36300). Jan 16 18:00:41.908723 kernel: audit: type=1130 audit(1768586441.903:737): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-49.12.189.56:22-68.220.241.50:36300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:42.104899 containerd[1547]: time="2026-01-16T18:00:42.104768843Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io Jan 16 18:00:42.108434 containerd[1547]: time="2026-01-16T18:00:42.108372148Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/goldmane:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" Jan 16 18:00:42.108572 containerd[1547]: time="2026-01-16T18:00:42.108481592Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/goldmane:v3.30.4: active requests=0, bytes read=0" Jan 16 18:00:42.109012 kubelet[2825]: E0116 18:00:42.108971 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:00:42.109092 kubelet[2825]: E0116 18:00:42.109019 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" image="ghcr.io/flatcar/calico/goldmane:v3.30.4" Jan 16 18:00:42.109123 kubelet[2825]: E0116 18:00:42.109088 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container goldmane start failed in pod goldmane-7c778bb748-pqdht_calico-system(2eb1812e-29cb-4b14-8060-fe2f9a701433): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/goldmane:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found" logger="UnhandledError" Jan 16 18:00:42.109150 kubelet[2825]: E0116 18:00:42.109120 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ErrImagePull: \"rpc 
error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:00:42.459000 audit[5052]: USER_ACCT pid=5052 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:42.463253 sshd[5052]: Accepted publickey for core from 68.220.241.50 port 36300 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:00:42.463649 kernel: audit: type=1101 audit(1768586442.459:738): pid=5052 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:42.464000 audit[5052]: CRED_ACQ pid=5052 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:42.467583 sshd-session[5052]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:00:42.468956 kernel: audit: type=1103 audit(1768586442.464:739): pid=5052 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:42.469374 kernel: audit: type=1006 audit(1768586442.464:740): pid=5052 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=9 res=1 Jan 16 18:00:42.464000 audit[5052]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc8730a90 a2=3 a3=0 items=0 ppid=1 pid=5052 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:42.472497 kernel: audit: type=1300 audit(1768586442.464:740): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc8730a90 a2=3 a3=0 items=0 ppid=1 pid=5052 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=9 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:42.474575 kernel: audit: type=1327 audit(1768586442.464:740): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:42.464000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:42.481431 systemd-logind[1532]: New session 9 of user core. Jan 16 18:00:42.487771 systemd[1]: Started session-9.scope - Session 9 of User core. 
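The pull attempts above all end the same way: containerd gets an HTTP 404 from ghcr.io for every Calico v3.30.4 image it asks for (apiserver, whisker-backend, goldmane), so the kubelet reports ErrImagePull and the affected pods never start. To check whether a tag is genuinely absent from the registry, rather than blocked by auth or networking on the node, the manifest can be queried directly. The Go sketch below is a minimal illustration only: it assumes ghcr.io serves public repositories through the standard registry v2 anonymous-token flow, and the repository and tag names are simply copied from the failing pulls in this log.

// check_tag.go - minimal sketch (assumption: ghcr.io follows the standard
// Docker registry v2 anonymous token flow for public repositories).
package main

import (
	"encoding/json"
	"fmt"
	"net/http"
	"os"
)

func tagExists(repo, tag string) (bool, error) {
	// 1. Fetch an anonymous pull token for the repository.
	tokURL := fmt.Sprintf("https://ghcr.io/token?service=ghcr.io&scope=repository:%s:pull", repo)
	resp, err := http.Get(tokURL)
	if err != nil {
		return false, err
	}
	defer resp.Body.Close()
	var tok struct {
		Token string `json:"token"`
	}
	if err := json.NewDecoder(resp.Body).Decode(&tok); err != nil {
		return false, err
	}

	// 2. HEAD the manifest: 200 means the tag resolves, 404 matches the
	//    "failed to resolve image ... not found" errors in the log above.
	req, err := http.NewRequest(http.MethodHead,
		fmt.Sprintf("https://ghcr.io/v2/%s/manifests/%s", repo, tag), nil)
	if err != nil {
		return false, err
	}
	req.Header.Set("Authorization", "Bearer "+tok.Token)
	req.Header.Set("Accept", "application/vnd.oci.image.index.v1+json")
	req.Header.Add("Accept", "application/vnd.docker.distribution.manifest.list.v2+json")
	res, err := http.DefaultClient.Do(req)
	if err != nil {
		return false, err
	}
	defer res.Body.Close()
	return res.StatusCode == http.StatusOK, nil
}

func main() {
	// Repository and tag taken from the failing pulls logged above.
	ok, err := tagExists("flatcar/calico/apiserver", "v3.30.4")
	if err != nil {
		fmt.Fprintln(os.Stderr, "lookup failed:", err)
		os.Exit(1)
	}
	fmt.Println("tag exists:", ok)
}

A 404 from the manifest endpoint would match the errors above and point at a missing or renamed tag in the registry rather than a node-side problem.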
Jan 16 18:00:42.495000 audit[5052]: USER_START pid=5052 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.500000 audit[5057]: CRED_ACQ pid=5057 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.503271 kernel: audit: type=1105 audit(1768586442.495:741): pid=5052 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.503357 kernel: audit: type=1103 audit(1768586442.500:742): pid=5057 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.904902 sshd[5057]: Connection closed by 68.220.241.50 port 36300
Jan 16 18:00:42.906772 sshd-session[5052]: pam_unix(sshd:session): session closed for user core
Jan 16 18:00:42.909000 audit[5052]: USER_END pid=5052 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.913693 systemd[1]: session-9.scope: Deactivated successfully.
Jan 16 18:00:42.909000 audit[5052]: CRED_DISP pid=5052 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.916541 kernel: audit: type=1106 audit(1768586442.909:743): pid=5052 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.916637 kernel: audit: type=1104 audit(1768586442.909:744): pid=5052 uid=0 auid=500 ses=9 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success'
Jan 16 18:00:42.916999 systemd-logind[1532]: Session 9 logged out. Waiting for processes to exit.
Jan 16 18:00:42.917217 systemd[1]: sshd@9-49.12.189.56:22-68.220.241.50:36300.service: Deactivated successfully.
Jan 16 18:00:42.916000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@9-49.12.189.56:22-68.220.241.50:36300 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success'
Jan 16 18:00:42.925370 systemd-logind[1532]: Removed session 9.
Jan 16 18:00:48.021000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-49.12.189.56:22-68.220.241.50:35258 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:48.021891 systemd[1]: Started sshd@10-49.12.189.56:22-68.220.241.50:35258.service - OpenSSH per-connection server daemon (68.220.241.50:35258). Jan 16 18:00:48.025614 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:00:48.025703 kernel: audit: type=1130 audit(1768586448.021:746): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-49.12.189.56:22-68.220.241.50:35258 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:48.580000 audit[5074]: USER_ACCT pid=5074 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.581163 sshd[5074]: Accepted publickey for core from 68.220.241.50 port 35258 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:00:48.585000 audit[5074]: CRED_ACQ pid=5074 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.588345 kernel: audit: type=1101 audit(1768586448.580:747): pid=5074 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.588390 kernel: audit: type=1103 audit(1768586448.585:748): pid=5074 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.589288 sshd-session[5074]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:00:48.591801 kernel: audit: type=1006 audit(1768586448.587:749): pid=5074 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=10 res=1 Jan 16 18:00:48.593513 kernel: audit: type=1300 audit(1768586448.587:749): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe9727090 a2=3 a3=0 items=0 ppid=1 pid=5074 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.587000 audit[5074]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe9727090 a2=3 a3=0 items=0 ppid=1 pid=5074 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=10 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:48.587000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:48.597586 kernel: audit: type=1327 audit(1768586448.587:749): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:48.599588 systemd-logind[1532]: New session 10 of user core. 
Jan 16 18:00:48.606116 systemd[1]: Started session-10.scope - Session 10 of User core. Jan 16 18:00:48.611000 audit[5074]: USER_START pid=5074 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.616585 kernel: audit: type=1105 audit(1768586448.611:750): pid=5074 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.617000 audit[5078]: CRED_ACQ pid=5078 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:48.620578 kernel: audit: type=1103 audit(1768586448.617:751): pid=5078 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:49.000229 sshd[5078]: Connection closed by 68.220.241.50 port 35258 Jan 16 18:00:49.000045 sshd-session[5074]: pam_unix(sshd:session): session closed for user core Jan 16 18:00:49.002000 audit[5074]: USER_END pid=5074 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:49.002000 audit[5074]: CRED_DISP pid=5074 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:49.012364 kernel: audit: type=1106 audit(1768586449.002:752): pid=5074 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:49.012486 kernel: audit: type=1104 audit(1768586449.002:753): pid=5074 uid=0 auid=500 ses=10 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:49.013993 systemd[1]: sshd@10-49.12.189.56:22-68.220.241.50:35258.service: Deactivated successfully. Jan 16 18:00:49.013000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@10-49.12.189.56:22-68.220.241.50:35258 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:49.018540 systemd[1]: session-10.scope: Deactivated successfully. Jan 16 18:00:49.022934 systemd-logind[1532]: Session 10 logged out. 
Waiting for processes to exit. Jan 16 18:00:49.024615 systemd-logind[1532]: Removed session 10. Jan 16 18:00:49.747291 kubelet[2825]: E0116 18:00:49.747249 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:00:51.748066 kubelet[2825]: E0116 18:00:51.747979 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:00:52.750596 kubelet[2825]: E0116 18:00:52.750022 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:00:53.748870 kubelet[2825]: E0116 18:00:53.748773 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:00:53.748870 kubelet[2825]: E0116 18:00:53.748790 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image 
\\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:00:54.115000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-49.12.189.56:22-68.220.241.50:58226 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:54.118292 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:00:54.118424 kernel: audit: type=1130 audit(1768586454.115:755): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-49.12.189.56:22-68.220.241.50:58226 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:54.117002 systemd[1]: Started sshd@11-49.12.189.56:22-68.220.241.50:58226.service - OpenSSH per-connection server daemon (68.220.241.50:58226). Jan 16 18:00:54.681000 audit[5090]: USER_ACCT pid=5090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.685270 sshd[5090]: Accepted publickey for core from 68.220.241.50 port 58226 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:00:54.690341 kernel: audit: type=1101 audit(1768586454.681:756): pid=5090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.690467 kernel: audit: type=1103 audit(1768586454.684:757): pid=5090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.684000 audit[5090]: CRED_ACQ pid=5090 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.687581 sshd-session[5090]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:00:54.693620 kernel: audit: type=1006 audit(1768586454.684:758): pid=5090 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=11 res=1 Jan 16 18:00:54.684000 audit[5090]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc068cc10 a2=3 a3=0 items=0 ppid=1 pid=5090 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:54.697001 kernel: audit: type=1300 audit(1768586454.684:758): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffc068cc10 a2=3 a3=0 items=0 ppid=1 pid=5090 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=11 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:54.698619 kernel: audit: type=1327 audit(1768586454.684:758): 
proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:54.684000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:54.701618 systemd-logind[1532]: New session 11 of user core. Jan 16 18:00:54.708878 systemd[1]: Started session-11.scope - Session 11 of User core. Jan 16 18:00:54.713000 audit[5090]: USER_START pid=5090 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.717000 audit[5094]: CRED_ACQ pid=5094 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.721475 kernel: audit: type=1105 audit(1768586454.713:759): pid=5090 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:54.721641 kernel: audit: type=1103 audit(1768586454.717:760): pid=5094 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.098362 sshd[5094]: Connection closed by 68.220.241.50 port 58226 Jan 16 18:00:55.099994 sshd-session[5090]: pam_unix(sshd:session): session closed for user core Jan 16 18:00:55.100000 audit[5090]: USER_END pid=5090 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.106588 systemd-logind[1532]: Session 11 logged out. Waiting for processes to exit. Jan 16 18:00:55.100000 audit[5090]: CRED_DISP pid=5090 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.109006 systemd[1]: sshd@11-49.12.189.56:22-68.220.241.50:58226.service: Deactivated successfully. 
Jan 16 18:00:55.109769 kernel: audit: type=1106 audit(1768586455.100:761): pid=5090 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.109834 kernel: audit: type=1104 audit(1768586455.100:762): pid=5090 uid=0 auid=500 ses=11 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.106000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@11-49.12.189.56:22-68.220.241.50:58226 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:55.112653 systemd[1]: session-11.scope: Deactivated successfully. Jan 16 18:00:55.115723 systemd-logind[1532]: Removed session 11. Jan 16 18:00:55.205000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-49.12.189.56:22-68.220.241.50:58238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:55.206831 systemd[1]: Started sshd@12-49.12.189.56:22-68.220.241.50:58238.service - OpenSSH per-connection server daemon (68.220.241.50:58238). Jan 16 18:00:55.776000 audit[5107]: USER_ACCT pid=5107 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.778869 sshd[5107]: Accepted publickey for core from 68.220.241.50 port 58238 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:00:55.779000 audit[5107]: CRED_ACQ pid=5107 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.779000 audit[5107]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe54f7f80 a2=3 a3=0 items=0 ppid=1 pid=5107 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=12 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:55.779000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:55.781870 sshd-session[5107]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:00:55.789745 systemd-logind[1532]: New session 12 of user core. Jan 16 18:00:55.795736 systemd[1]: Started session-12.scope - Session 12 of User core. 
Jan 16 18:00:55.801000 audit[5107]: USER_START pid=5107 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:55.804000 audit[5111]: CRED_ACQ pid=5111 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:56.239696 sshd[5111]: Connection closed by 68.220.241.50 port 58238 Jan 16 18:00:56.240355 sshd-session[5107]: pam_unix(sshd:session): session closed for user core Jan 16 18:00:56.243000 audit[5107]: USER_END pid=5107 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:56.243000 audit[5107]: CRED_DISP pid=5107 uid=0 auid=500 ses=12 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:56.251051 systemd-logind[1532]: Session 12 logged out. Waiting for processes to exit. Jan 16 18:00:56.251777 systemd[1]: sshd@12-49.12.189.56:22-68.220.241.50:58238.service: Deactivated successfully. Jan 16 18:00:56.254405 systemd[1]: session-12.scope: Deactivated successfully. Jan 16 18:00:56.251000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@12-49.12.189.56:22-68.220.241.50:58238 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:56.260434 systemd-logind[1532]: Removed session 12. Jan 16 18:00:56.348028 systemd[1]: Started sshd@13-49.12.189.56:22-68.220.241.50:58254.service - OpenSSH per-connection server daemon (68.220.241.50:58254). Jan 16 18:00:56.346000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-49.12.189.56:22-68.220.241.50:58254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:00:56.883000 audit[5122]: USER_ACCT pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:56.885775 sshd[5122]: Accepted publickey for core from 68.220.241.50 port 58254 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:00:56.885000 audit[5122]: CRED_ACQ pid=5122 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:56.885000 audit[5122]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffee475500 a2=3 a3=0 items=0 ppid=1 pid=5122 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=13 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:00:56.885000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:00:56.888428 sshd-session[5122]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:00:56.897618 systemd-logind[1532]: New session 13 of user core. Jan 16 18:00:56.902781 systemd[1]: Started session-13.scope - Session 13 of User core. Jan 16 18:00:56.905000 audit[5122]: USER_START pid=5122 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:56.908000 audit[5126]: CRED_ACQ pid=5126 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:57.275515 sshd[5126]: Connection closed by 68.220.241.50 port 58254 Jan 16 18:00:57.277778 sshd-session[5122]: pam_unix(sshd:session): session closed for user core Jan 16 18:00:57.285000 audit[5122]: USER_END pid=5122 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:57.285000 audit[5122]: CRED_DISP pid=5122 uid=0 auid=500 ses=13 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:00:57.294353 systemd[1]: sshd@13-49.12.189.56:22-68.220.241.50:58254.service: Deactivated successfully. Jan 16 18:00:57.295000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@13-49.12.189.56:22-68.220.241.50:58254 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:00:57.300320 systemd[1]: session-13.scope: Deactivated successfully. Jan 16 18:00:57.302841 systemd-logind[1532]: Session 13 logged out. Waiting for processes to exit. 
Jan 16 18:00:57.304287 systemd-logind[1532]: Removed session 13. Jan 16 18:00:57.748315 kubelet[2825]: E0116 18:00:57.748257 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:01:02.386446 systemd[1]: Started sshd@14-49.12.189.56:22-68.220.241.50:58262.service - OpenSSH per-connection server daemon (68.220.241.50:58262). Jan 16 18:01:02.385000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-49.12.189.56:22-68.220.241.50:58262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:02.389814 kernel: kauditd_printk_skb: 23 callbacks suppressed Jan 16 18:01:02.389904 kernel: audit: type=1130 audit(1768586462.385:782): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-49.12.189.56:22-68.220.241.50:58262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:02.747684 kubelet[2825]: E0116 18:01:02.747519 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:01:02.925000 audit[5166]: USER_ACCT pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.927995 sshd[5166]: Accepted publickey for core from 68.220.241.50 port 58262 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:02.929573 kernel: audit: type=1101 audit(1768586462.925:783): pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.928000 audit[5166]: CRED_ACQ pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.930938 sshd-session[5166]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:02.933581 kernel: audit: type=1103 audit(1768586462.928:784): pid=5166 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 
addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.933656 kernel: audit: type=1006 audit(1768586462.928:785): pid=5166 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=14 res=1 Jan 16 18:01:02.928000 audit[5166]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe2d352e0 a2=3 a3=0 items=0 ppid=1 pid=5166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:02.936212 kernel: audit: type=1300 audit(1768586462.928:785): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe2d352e0 a2=3 a3=0 items=0 ppid=1 pid=5166 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=14 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:02.936295 kernel: audit: type=1327 audit(1768586462.928:785): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:02.928000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:02.938999 systemd-logind[1532]: New session 14 of user core. Jan 16 18:01:02.947083 systemd[1]: Started session-14.scope - Session 14 of User core. Jan 16 18:01:02.950000 audit[5166]: USER_START pid=5166 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.956000 audit[5170]: CRED_ACQ pid=5170 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.960368 kernel: audit: type=1105 audit(1768586462.950:786): pid=5166 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:02.960475 kernel: audit: type=1103 audit(1768586462.956:787): pid=5170 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:03.368684 sshd[5170]: Connection closed by 68.220.241.50 port 58262 Jan 16 18:01:03.371775 sshd-session[5166]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:03.371000 audit[5166]: USER_END pid=5166 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:03.376404 systemd[1]: sshd@14-49.12.189.56:22-68.220.241.50:58262.service: Deactivated successfully. 
Jan 16 18:01:03.372000 audit[5166]: CRED_DISP pid=5166 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:03.381021 kernel: audit: type=1106 audit(1768586463.371:788): pid=5166 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:03.381102 kernel: audit: type=1104 audit(1768586463.372:789): pid=5166 uid=0 auid=500 ses=14 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:03.379933 systemd[1]: session-14.scope: Deactivated successfully. Jan 16 18:01:03.376000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@14-49.12.189.56:22-68.220.241.50:58262 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:03.386289 systemd-logind[1532]: Session 14 logged out. Waiting for processes to exit. Jan 16 18:01:03.389255 systemd-logind[1532]: Removed session 14. Jan 16 18:01:03.750672 kubelet[2825]: E0116 18:01:03.750544 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:01:04.748317 kubelet[2825]: E0116 18:01:04.748242 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:01:05.747111 kubelet[2825]: E0116 18:01:05.747068 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" 
pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:01:06.751089 kubelet[2825]: E0116 18:01:06.750935 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:01:08.486319 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:01:08.486434 kernel: audit: type=1130 audit(1768586468.483:791): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-49.12.189.56:22-68.220.241.50:57200 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:08.483000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-49.12.189.56:22-68.220.241.50:57200 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:08.484047 systemd[1]: Started sshd@15-49.12.189.56:22-68.220.241.50:57200.service - OpenSSH per-connection server daemon (68.220.241.50:57200). 
Jan 16 18:01:09.041000 audit[5182]: USER_ACCT pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.042779 sshd[5182]: Accepted publickey for core from 68.220.241.50 port 57200 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:09.046896 sshd-session[5182]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:09.045000 audit[5182]: CRED_ACQ pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.049004 kernel: audit: type=1101 audit(1768586469.041:792): pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.049100 kernel: audit: type=1103 audit(1768586469.045:793): pid=5182 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.050943 kernel: audit: type=1006 audit(1768586469.045:794): pid=5182 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=15 res=1 Jan 16 18:01:09.051019 kernel: audit: type=1300 audit(1768586469.045:794): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff0ed7b40 a2=3 a3=0 items=0 ppid=1 pid=5182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:09.045000 audit[5182]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff0ed7b40 a2=3 a3=0 items=0 ppid=1 pid=5182 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=15 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:09.045000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:09.055624 kernel: audit: type=1327 audit(1768586469.045:794): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:09.059853 systemd-logind[1532]: New session 15 of user core. Jan 16 18:01:09.066838 systemd[1]: Started session-15.scope - Session 15 of User core. 
Jan 16 18:01:09.072000 audit[5182]: USER_START pid=5182 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.076000 audit[5188]: CRED_ACQ pid=5188 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.078919 kernel: audit: type=1105 audit(1768586469.072:795): pid=5182 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.078989 kernel: audit: type=1103 audit(1768586469.076:796): pid=5188 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.462003 sshd[5188]: Connection closed by 68.220.241.50 port 57200 Jan 16 18:01:09.464412 sshd-session[5182]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:09.468000 audit[5182]: USER_END pid=5182 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.475140 systemd[1]: sshd@15-49.12.189.56:22-68.220.241.50:57200.service: Deactivated successfully. Jan 16 18:01:09.468000 audit[5182]: CRED_DISP pid=5182 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.479699 kernel: audit: type=1106 audit(1768586469.468:797): pid=5182 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.479807 kernel: audit: type=1104 audit(1768586469.468:798): pid=5182 uid=0 auid=500 ses=15 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:09.482340 systemd[1]: session-15.scope: Deactivated successfully. Jan 16 18:01:09.474000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@15-49.12.189.56:22-68.220.241.50:57200 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:09.485007 systemd-logind[1532]: Session 15 logged out. Waiting for processes to exit. Jan 16 18:01:09.485945 systemd-logind[1532]: Removed session 15. 
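Each SSH connection in this stretch leaves the same audit trail: SERVICE_START for the per-connection sshd unit, USER_ACCT and CRED_ACQ for the "core" user, USER_START when the PAM session opens, then USER_END, CRED_DISP and SERVICE_STOP when it closes shortly afterwards. The sketch below pairs USER_START and USER_END records by their ses= field to report how long each session stayed open; it is an illustrative assumption that the journal is exported one record per line (the wrapped dump above would need to be unwrapped first) and that the "Jan 16 18:01:09.072000" timestamp prefix seen here is what gets parsed.

// ssh_sessions.go - minimal sketch: pair USER_START / USER_END audit records
// by their ses= field and report how long each SSH session stayed open,
// using the journal's own timestamp prefix (assumed format, see lead-in).
package main

import (
	"bufio"
	"fmt"
	"os"
	"regexp"
	"strings"
	"time"
)

var (
	stampRe = regexp.MustCompile(`^([A-Z][a-z]{2} +\d+ \d{2}:\d{2}:\d{2}\.\d+)`)
	sesRe   = regexp.MustCompile(` ses=(\d+)`)
)

func main() {
	started := map[string]time.Time{} // ses id -> time of USER_START

	sc := bufio.NewScanner(os.Stdin)
	sc.Buffer(make([]byte, 0, 1024*1024), 1024*1024) // audit records can be long
	for sc.Scan() {
		line := sc.Text()
		ts := stampRe.FindStringSubmatch(line)
		ses := sesRe.FindStringSubmatch(line)
		if ts == nil || ses == nil {
			continue
		}
		t, err := time.Parse("Jan _2 15:04:05.000000", ts[1])
		if err != nil {
			continue
		}
		switch {
		case strings.Contains(line, "USER_START"):
			started[ses[1]] = t
		case strings.Contains(line, "USER_END"):
			if t0, ok := started[ses[1]]; ok {
				fmt.Printf("ses=%s open for %v\n", ses[1], t.Sub(t0))
				delete(started, ses[1])
			}
		}
	}
	if err := sc.Err(); err != nil {
		fmt.Fprintln(os.Stderr, "read error:", err)
	}
}

For the sessions above (ses=9 through 16) the gap between open and close is well under a second each time, which is consistent with automated connections rather than interactive logins.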
Jan 16 18:01:09.751647 kubelet[2825]: E0116 18:01:09.750505 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:01:14.575878 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:01:14.575982 kernel: audit: type=1130 audit(1768586474.573:800): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-49.12.189.56:22-68.220.241.50:52356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:14.573000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-49.12.189.56:22-68.220.241.50:52356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:14.574028 systemd[1]: Started sshd@16-49.12.189.56:22-68.220.241.50:52356.service - OpenSSH per-connection server daemon (68.220.241.50:52356). Jan 16 18:01:14.751876 kubelet[2825]: E0116 18:01:14.751799 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:01:15.151000 audit[5200]: USER_ACCT pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.155789 sshd[5200]: Accepted publickey for core from 68.220.241.50 port 52356 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:15.157604 kernel: audit: type=1101 audit(1768586475.151:801): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.157000 audit[5200]: CRED_ACQ pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.162686 sshd-session[5200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:15.166589 kernel: 
audit: type=1103 audit(1768586475.157:802): pid=5200 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.166773 kernel: audit: type=1006 audit(1768586475.161:803): pid=5200 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=16 res=1 Jan 16 18:01:15.166797 kernel: audit: type=1300 audit(1768586475.161:803): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff49ee740 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:15.161000 audit[5200]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff49ee740 a2=3 a3=0 items=0 ppid=1 pid=5200 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=16 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:15.168638 kernel: audit: type=1327 audit(1768586475.161:803): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:15.161000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:15.175726 systemd-logind[1532]: New session 16 of user core. Jan 16 18:01:15.179798 systemd[1]: Started session-16.scope - Session 16 of User core. Jan 16 18:01:15.186000 audit[5200]: USER_START pid=5200 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.191000 audit[5204]: CRED_ACQ pid=5204 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.194426 kernel: audit: type=1105 audit(1768586475.186:804): pid=5200 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.194526 kernel: audit: type=1103 audit(1768586475.191:805): pid=5204 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.552990 sshd[5204]: Connection closed by 68.220.241.50 port 52356 Jan 16 18:01:15.554067 sshd-session[5200]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:15.557000 audit[5200]: USER_END pid=5200 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.560000 audit[5200]: CRED_DISP pid=5200 
uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.563567 kernel: audit: type=1106 audit(1768586475.557:806): pid=5200 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.563655 kernel: audit: type=1104 audit(1768586475.560:807): pid=5200 uid=0 auid=500 ses=16 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:15.566918 systemd-logind[1532]: Session 16 logged out. Waiting for processes to exit. Jan 16 18:01:15.567743 systemd[1]: sshd@16-49.12.189.56:22-68.220.241.50:52356.service: Deactivated successfully. Jan 16 18:01:15.569000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@16-49.12.189.56:22-68.220.241.50:52356 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:15.572767 systemd[1]: session-16.scope: Deactivated successfully. Jan 16 18:01:15.575799 systemd-logind[1532]: Removed session 16. Jan 16 18:01:15.659000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-49.12.189.56:22-68.220.241.50:52360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:15.659923 systemd[1]: Started sshd@17-49.12.189.56:22-68.220.241.50:52360.service - OpenSSH per-connection server daemon (68.220.241.50:52360). 
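
The audit PROCTITLE records above (type=1327) carry the process title as a hex string whose argv elements are separated by NUL bytes. A minimal decoding sketch, not part of the captured journal; the helper name is illustrative and the sample value is copied from the sshd-session record above.

# Decode an audit PROCTITLE hex payload into its NUL-separated argv fields.
def decode_proctitle(hex_value: str) -> list[str]:
    raw = bytes.fromhex(hex_value)
    return raw.decode("utf-8", errors="replace").split("\x00")

if __name__ == "__main__":
    # Value copied from the audit records above.
    print(decode_proctitle("737368642D73657373696F6E3A20636F7265205B707269765D"))
    # -> ['sshd-session: core [priv]']
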
Jan 16 18:01:15.747251 kubelet[2825]: E0116 18:01:15.746743 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:01:16.217000 audit[5218]: USER_ACCT pid=5218 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:16.218197 sshd[5218]: Accepted publickey for core from 68.220.241.50 port 52360 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:16.220000 audit[5218]: CRED_ACQ pid=5218 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:16.220000 audit[5218]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd1df5040 a2=3 a3=0 items=0 ppid=1 pid=5218 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=17 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:16.220000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:16.222789 sshd-session[5218]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:16.228120 systemd-logind[1532]: New session 17 of user core. Jan 16 18:01:16.232765 systemd[1]: Started session-17.scope - Session 17 of User core. 
Jan 16 18:01:16.236000 audit[5218]: USER_START pid=5218 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:16.240000 audit[5222]: CRED_ACQ pid=5222 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:16.753379 kubelet[2825]: E0116 18:01:16.752411 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:01:16.775806 sshd[5222]: Connection closed by 68.220.241.50 port 52360 Jan 16 18:01:16.779097 sshd-session[5218]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:16.783000 audit[5218]: USER_END pid=5218 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:16.783000 audit[5218]: CRED_DISP pid=5218 uid=0 auid=500 ses=17 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:16.787577 systemd[1]: sshd@17-49.12.189.56:22-68.220.241.50:52360.service: Deactivated successfully. Jan 16 18:01:16.787000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@17-49.12.189.56:22-68.220.241.50:52360 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:16.792635 systemd[1]: session-17.scope: Deactivated successfully. Jan 16 18:01:16.796735 systemd-logind[1532]: Session 17 logged out. Waiting for processes to exit. Jan 16 18:01:16.798247 systemd-logind[1532]: Removed session 17. Jan 16 18:01:16.884000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-49.12.189.56:22-68.220.241.50:52372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:16.884421 systemd[1]: Started sshd@18-49.12.189.56:22-68.220.241.50:52372.service - OpenSSH per-connection server daemon (68.220.241.50:52372). 
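
The kubelet pod_workers.go entries interleaved above all repeat one failure mode: every Calico image referenced at tag v3.30.4 resolves to "not found" on ghcr.io/flatcar, so the pods sit in ImagePullBackOff. A rough sketch, assuming the journal is available as plain text lines, of listing which pod is blocked on which image; the helper name is illustrative and the regexes only cover the field shapes visible in these entries.

import re

# pod="namespace/name" field and ghcr.io/flatcar/calico image references,
# as they appear in the kubelet "Error syncing pod" lines above.
POD_RE = re.compile(r'pod="([^"]+)"')
IMAGE_RE = re.compile(r"ghcr\.io/flatcar/calico/[\w./-]+:v[\w.]+")

def failing_pulls(lines):
    for line in lines:
        if "ImagePullBackOff" not in line:
            continue
        pod = POD_RE.search(line)
        images = sorted(set(IMAGE_RE.findall(line)))
        if pod and images:
            yield pod.group(1), images

if __name__ == "__main__":
    sample = ('kubelet[2825]: "Error syncing pod, skipping" err="... ImagePullBackOff: '
              'Back-off pulling image ghcr.io/flatcar/calico/apiserver:v3.30.4 ..." '
              'pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t"')
    for pod, images in failing_pulls([sample]):
        print(pod, images)
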
Jan 16 18:01:17.443000 audit[5232]: USER_ACCT pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:17.444169 sshd[5232]: Accepted publickey for core from 68.220.241.50 port 52372 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:17.445000 audit[5232]: CRED_ACQ pid=5232 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:17.445000 audit[5232]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe443ca60 a2=3 a3=0 items=0 ppid=1 pid=5232 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=18 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:17.445000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:17.447489 sshd-session[5232]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:17.455794 systemd-logind[1532]: New session 18 of user core. Jan 16 18:01:17.462757 systemd[1]: Started session-18.scope - Session 18 of User core. Jan 16 18:01:17.466000 audit[5232]: USER_START pid=5232 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:17.470000 audit[5236]: CRED_ACQ pid=5236 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:17.748241 kubelet[2825]: E0116 18:01:17.747843 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:01:18.582000 audit[5246]: NETFILTER_CFG table=filter:142 family=2 entries=26 op=nft_register_rule pid=5246 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:18.582000 audit[5246]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=ffffde030f70 a2=0 a3=1 items=0 ppid=2976 pid=5246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:18.582000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:18.589000 audit[5246]: NETFILTER_CFG table=nat:143 family=2 entries=20 op=nft_register_rule pid=5246 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 
18:01:18.589000 audit[5246]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=ffffde030f70 a2=0 a3=1 items=0 ppid=2976 pid=5246 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:18.589000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:18.608017 sshd[5236]: Connection closed by 68.220.241.50 port 52372 Jan 16 18:01:18.608949 sshd-session[5232]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:18.612000 audit[5232]: USER_END pid=5232 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:18.612000 audit[5232]: CRED_DISP pid=5232 uid=0 auid=500 ses=18 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:18.617191 systemd[1]: sshd@18-49.12.189.56:22-68.220.241.50:52372.service: Deactivated successfully. Jan 16 18:01:18.617000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@18-49.12.189.56:22-68.220.241.50:52372 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:18.620697 systemd[1]: session-18.scope: Deactivated successfully. Jan 16 18:01:18.622424 systemd-logind[1532]: Session 18 logged out. Waiting for processes to exit. Jan 16 18:01:18.624761 systemd-logind[1532]: Removed session 18. Jan 16 18:01:18.729863 systemd[1]: Started sshd@19-49.12.189.56:22-68.220.241.50:52382.service - OpenSSH per-connection server daemon (68.220.241.50:52382). Jan 16 18:01:18.729000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-49.12.189.56:22-68.220.241.50:52382 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? 
res=success' Jan 16 18:01:19.323000 audit[5251]: USER_ACCT pid=5251 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.324618 sshd[5251]: Accepted publickey for core from 68.220.241.50 port 52382 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:19.328000 audit[5251]: CRED_ACQ pid=5251 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.328000 audit[5251]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffed8217c0 a2=3 a3=0 items=0 ppid=1 pid=5251 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=19 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:19.328000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:19.331812 sshd-session[5251]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:19.338851 systemd-logind[1532]: New session 19 of user core. Jan 16 18:01:19.344793 systemd[1]: Started session-19.scope - Session 19 of User core. Jan 16 18:01:19.349000 audit[5251]: USER_START pid=5251 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.352000 audit[5255]: CRED_ACQ pid=5255 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.663597 kernel: kauditd_printk_skb: 37 callbacks suppressed Jan 16 18:01:19.663722 kernel: audit: type=1325 audit(1768586479.661:835): table=filter:144 family=2 entries=38 op=nft_register_rule pid=5262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:19.661000 audit[5262]: NETFILTER_CFG table=filter:144 family=2 entries=38 op=nft_register_rule pid=5262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:19.661000 audit[5262]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff9fd3590 a2=0 a3=1 items=0 ppid=2976 pid=5262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:19.667738 kernel: audit: type=1300 audit(1768586479.661:835): arch=c00000b7 syscall=211 success=yes exit=14176 a0=3 a1=fffff9fd3590 a2=0 a3=1 items=0 ppid=2976 pid=5262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:19.661000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:19.669174 kernel: audit: type=1327 audit(1768586479.661:835): 
proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:19.670000 audit[5262]: NETFILTER_CFG table=nat:145 family=2 entries=20 op=nft_register_rule pid=5262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:19.670000 audit[5262]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff9fd3590 a2=0 a3=1 items=0 ppid=2976 pid=5262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:19.674703 kernel: audit: type=1325 audit(1768586479.670:836): table=nat:145 family=2 entries=20 op=nft_register_rule pid=5262 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:19.674835 kernel: audit: type=1300 audit(1768586479.670:836): arch=c00000b7 syscall=211 success=yes exit=5772 a0=3 a1=fffff9fd3590 a2=0 a3=1 items=0 ppid=2976 pid=5262 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:19.674866 kernel: audit: type=1327 audit(1768586479.670:836): proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:19.670000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:19.893062 sshd[5255]: Connection closed by 68.220.241.50 port 52382 Jan 16 18:01:19.893736 sshd-session[5251]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:19.895000 audit[5251]: USER_END pid=5251 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.895000 audit[5251]: CRED_DISP pid=5251 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.904698 kernel: audit: type=1106 audit(1768586479.895:837): pid=5251 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.904794 kernel: audit: type=1104 audit(1768586479.895:838): pid=5251 uid=0 auid=500 ses=19 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:19.904000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-49.12.189.56:22-68.220.241.50:52382 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:19.905264 systemd[1]: sshd@19-49.12.189.56:22-68.220.241.50:52382.service: Deactivated successfully. 
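
Each audit record is stamped audit(<epoch seconds>.<millis>:<serial>), and the same instant appears as the journal's wall-clock prefix when kauditd echoes it: audit(1768586479.661:835) above lines up with the Jan 16 18:01:19.661 entry. A small conversion sketch (UTC assumed, helper name illustrative):

from datetime import datetime, timezone
import re

AUDIT_TS_RE = re.compile(r"audit\((\d+)\.(\d+):(\d+)\)")

def audit_wallclock(field: str) -> str:
    # Split audit(<seconds>.<millis>:<serial>) and render the wall-clock form.
    seconds, millis, serial = AUDIT_TS_RE.search(field).groups()
    ts = datetime.fromtimestamp(int(seconds), tz=timezone.utc)
    return f"{ts:%b %d %H:%M:%S}.{millis} (event {serial})"

if __name__ == "__main__":
    print(audit_wallclock("audit(1768586479.661:835)"))  # Jan 16 18:01:19.661 (event 835)
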
Jan 16 18:01:19.907448 kernel: audit: type=1131 audit(1768586479.904:839): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@19-49.12.189.56:22-68.220.241.50:52382 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:19.908297 systemd[1]: session-19.scope: Deactivated successfully. Jan 16 18:01:19.909607 systemd-logind[1532]: Session 19 logged out. Waiting for processes to exit. Jan 16 18:01:19.913573 systemd-logind[1532]: Removed session 19. Jan 16 18:01:20.000755 systemd[1]: Started sshd@20-49.12.189.56:22-68.220.241.50:52386.service - OpenSSH per-connection server daemon (68.220.241.50:52386). Jan 16 18:01:20.000000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-49.12.189.56:22-68.220.241.50:52386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:20.009582 kernel: audit: type=1130 audit(1768586480.000:840): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-49.12.189.56:22-68.220.241.50:52386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:20.566000 audit[5269]: USER_ACCT pid=5269 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:20.567718 sshd[5269]: Accepted publickey for core from 68.220.241.50 port 52386 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:20.570000 audit[5269]: CRED_ACQ pid=5269 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:20.570000 audit[5269]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffd65ff680 a2=3 a3=0 items=0 ppid=1 pid=5269 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=20 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:20.570000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:20.572119 sshd-session[5269]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:20.580059 systemd-logind[1532]: New session 20 of user core. Jan 16 18:01:20.588952 systemd[1]: Started session-20.scope - Session 20 of User core. 
Jan 16 18:01:20.593000 audit[5269]: USER_START pid=5269 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:20.595000 audit[5273]: CRED_ACQ pid=5273 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:20.750994 kubelet[2825]: E0116 18:01:20.750927 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:01:20.956595 sshd[5273]: Connection closed by 68.220.241.50 port 52386 Jan 16 18:01:20.956605 sshd-session[5269]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:20.961000 audit[5269]: USER_END pid=5269 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:20.961000 audit[5269]: CRED_DISP pid=5269 uid=0 auid=500 ses=20 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:20.965158 systemd[1]: sshd@20-49.12.189.56:22-68.220.241.50:52386.service: Deactivated successfully. Jan 16 18:01:20.966000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@20-49.12.189.56:22-68.220.241.50:52386 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:20.968533 systemd[1]: session-20.scope: Deactivated successfully. Jan 16 18:01:20.973003 systemd-logind[1532]: Session 20 logged out. Waiting for processes to exit. Jan 16 18:01:20.974916 systemd-logind[1532]: Removed session 20. 
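
Sessions 16 through 20 above follow the same shape: a publickey accept for user core from 68.220.241.50, a pam_unix session open, and a close within a second or so. A sketch, assuming these journal lines as input text and that the year (absent from the prefix) is supplied by the caller, that pairs open/close events per sshd-session pid and reports durations; names are illustrative.

from datetime import datetime
import re

# e.g. "Jan 16 18:01:15.162686 sshd-session[5200]: pam_unix(sshd:session): session opened ..."
LINE_RE = re.compile(
    r"^(?P<ts>\w{3} \d{2} \d{2}:\d{2}:\d{2}\.\d+) sshd-session\[(?P<pid>\d+)\]: "
    r"pam_unix\(sshd:session\): session (?P<what>opened|closed)"
)

def session_durations(lines, year=2026):
    opened = {}
    for line in lines:
        m = LINE_RE.match(line)
        if not m:
            continue
        ts = datetime.strptime(f"{year} {m['ts']}", "%Y %b %d %H:%M:%S.%f")
        if m["what"] == "opened":
            opened[m["pid"]] = ts
        elif m["pid"] in opened:
            yield m["pid"], (ts - opened.pop(m["pid"])).total_seconds()

if __name__ == "__main__":
    sample = [
        "Jan 16 18:01:15.162686 sshd-session[5200]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0)",
        "Jan 16 18:01:15.554067 sshd-session[5200]: pam_unix(sshd:session): session closed for user core",
    ]
    for pid, seconds in session_durations(sample):
        print(f"sshd-session[{pid}] lasted {seconds:.3f}s")
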
Jan 16 18:01:22.748476 kubelet[2825]: E0116 18:01:22.748371 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:01:22.918000 audit[5284]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:22.918000 audit[5284]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=5248 a0=3 a1=ffffcf73bcc0 a2=0 a3=1 items=0 ppid=2976 pid=5284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:22.918000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:22.925000 audit[5284]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5284 subj=system_u:system_r:kernel_t:s0 comm="iptables-restor" Jan 16 18:01:22.925000 audit[5284]: SYSCALL arch=c00000b7 syscall=211 success=yes exit=48684 a0=3 a1=ffffcf73bcc0 a2=0 a3=1 items=0 ppid=2976 pid=5284 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="iptables-restor" exe="/usr/sbin/xtables-nft-multi" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:22.925000 audit: PROCTITLE proctitle=69707461626C65732D726573746F7265002D770035002D2D6E6F666C757368002D2D636F756E74657273 Jan 16 18:01:25.749174 kubelet[2825]: E0116 18:01:25.749071 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:01:26.069000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-49.12.189.56:22-68.220.241.50:53624 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:26.069782 systemd[1]: Started sshd@21-49.12.189.56:22-68.220.241.50:53624.service - OpenSSH per-connection server daemon (68.220.241.50:53624). 
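
The NETFILTER_CFG records (pid 5284 just above, 5246 and 5262 earlier) show xtables-nft-multi repeatedly re-registering rule sets in the filter and nat tables, with the batch size in the entries= field. A sketch tallying registered entries per table and operation from such records; this aggregates the log text only, it does not query the live ruleset.

import re
from collections import Counter

# NETFILTER_CFG fields as printed above, e.g.
#   table=filter:146 family=2 entries=26 op=nft_register_rule
CFG_RE = re.compile(r"table=(?P<table>\w+):\d+ family=\d+ entries=(?P<entries>\d+) op=(?P<op>\w+)")

def tally_netfilter(lines):
    totals = Counter()
    for line in lines:
        if "NETFILTER_CFG" not in line:
            continue
        m = CFG_RE.search(line)
        if m:
            totals[(m["table"], m["op"])] += int(m["entries"])
    return totals

if __name__ == "__main__":
    sample = [
        "audit[5284]: NETFILTER_CFG table=filter:146 family=2 entries=26 op=nft_register_rule pid=5284",
        "audit[5284]: NETFILTER_CFG table=nat:147 family=2 entries=104 op=nft_register_chain pid=5284",
    ]
    for (table, op), entries in tally_netfilter(sample).items():
        print(f"{table}: {entries} entries via {op}")
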
Jan 16 18:01:26.072211 kernel: kauditd_printk_skb: 16 callbacks suppressed Jan 16 18:01:26.072262 kernel: audit: type=1130 audit(1768586486.069:851): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-49.12.189.56:22-68.220.241.50:53624 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:26.616000 audit[5286]: USER_ACCT pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.617734 sshd[5286]: Accepted publickey for core from 68.220.241.50 port 53624 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:26.621000 audit[5286]: CRED_ACQ pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.622990 sshd-session[5286]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:26.628049 kernel: audit: type=1101 audit(1768586486.616:852): pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.628175 kernel: audit: type=1103 audit(1768586486.621:853): pid=5286 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.629944 kernel: audit: type=1006 audit(1768586486.621:854): pid=5286 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=21 res=1 Jan 16 18:01:26.621000 audit[5286]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe70a92f0 a2=3 a3=0 items=0 ppid=1 pid=5286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:26.630818 kernel: audit: type=1300 audit(1768586486.621:854): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=ffffe70a92f0 a2=3 a3=0 items=0 ppid=1 pid=5286 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=21 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:26.621000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:26.634345 kernel: audit: type=1327 audit(1768586486.621:854): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:26.635722 systemd-logind[1532]: New session 21 of user core. Jan 16 18:01:26.643824 systemd[1]: Started session-21.scope - Session 21 of User core. 
Jan 16 18:01:26.649000 audit[5286]: USER_START pid=5286 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.654000 audit[5290]: CRED_ACQ pid=5290 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.657614 kernel: audit: type=1105 audit(1768586486.649:855): pid=5286 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:26.657731 kernel: audit: type=1103 audit(1768586486.654:856): pid=5290 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:27.021646 sshd[5290]: Connection closed by 68.220.241.50 port 53624 Jan 16 18:01:27.023115 sshd-session[5286]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:27.025000 audit[5286]: USER_END pid=5286 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:27.031232 systemd[1]: sshd@21-49.12.189.56:22-68.220.241.50:53624.service: Deactivated successfully. Jan 16 18:01:27.033958 kernel: audit: type=1106 audit(1768586487.025:857): pid=5286 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:27.034057 kernel: audit: type=1104 audit(1768586487.026:858): pid=5286 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:27.026000 audit[5286]: CRED_DISP pid=5286 uid=0 auid=500 ses=21 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:27.033714 systemd[1]: session-21.scope: Deactivated successfully. Jan 16 18:01:27.031000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@21-49.12.189.56:22-68.220.241.50:53624 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:27.037427 systemd-logind[1532]: Session 21 logged out. Waiting for processes to exit. Jan 16 18:01:27.039069 systemd-logind[1532]: Removed session 21. 
Jan 16 18:01:29.748927 kubelet[2825]: E0116 18:01:29.748865 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:01:30.748764 kubelet[2825]: E0116 18:01:30.748713 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:01:31.748248 kubelet[2825]: E0116 18:01:31.748166 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:01:32.132524 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:01:32.132659 kernel: audit: type=1130 audit(1768586492.128:860): pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-49.12.189.56:22-68.220.241.50:53626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:32.128000 audit[1]: SERVICE_START pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-49.12.189.56:22-68.220.241.50:53626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:32.130244 systemd[1]: Started sshd@22-49.12.189.56:22-68.220.241.50:53626.service - OpenSSH per-connection server daemon (68.220.241.50:53626). 
Jan 16 18:01:32.661000 audit[5326]: USER_ACCT pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.665014 sshd[5326]: Accepted publickey for core from 68.220.241.50 port 53626 ssh2: RSA SHA256:9+z26MN7AwYt4+Fu/36id5j/He/wK/R0J/NBrgFPGqc Jan 16 18:01:32.665000 audit[5326]: CRED_ACQ pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.669473 kernel: audit: type=1101 audit(1768586492.661:861): pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:accounting grantors=pam_access,pam_time,pam_unix,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.669543 kernel: audit: type=1103 audit(1768586492.665:862): pid=5326 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.670993 kernel: audit: type=1006 audit(1768586492.665:863): pid=5326 uid=0 subj=system_u:system_r:kernel_t:s0 old-auid=4294967295 auid=500 tty=(none) old-ses=4294967295 ses=22 res=1 Jan 16 18:01:32.669717 sshd-session[5326]: pam_unix(sshd:session): session opened for user core(uid=500) by core(uid=0) Jan 16 18:01:32.674990 kernel: audit: type=1300 audit(1768586492.665:863): arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff8a78e30 a2=3 a3=0 items=0 ppid=1 pid=5326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:32.665000 audit[5326]: SYSCALL arch=c00000b7 syscall=64 success=yes exit=3 a0=8 a1=fffff8a78e30 a2=3 a3=0 items=0 ppid=1 pid=5326 auid=500 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=22 comm="sshd-session" exe="/usr/lib64/misc/sshd-session" subj=system_u:system_r:kernel_t:s0 key=(null) Jan 16 18:01:32.678173 kernel: audit: type=1327 audit(1768586492.665:863): proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:32.665000 audit: PROCTITLE proctitle=737368642D73657373696F6E3A20636F7265205B707269765D Jan 16 18:01:32.683616 systemd-logind[1532]: New session 22 of user core. Jan 16 18:01:32.688039 systemd[1]: Started session-22.scope - Session 22 of User core. 
Jan 16 18:01:32.691000 audit[5326]: USER_START pid=5326 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.694000 audit[5330]: CRED_ACQ pid=5330 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.698472 kernel: audit: type=1105 audit(1768586492.691:864): pid=5326 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_open grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:32.698541 kernel: audit: type=1103 audit(1768586492.694:865): pid=5330 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:33.051799 sshd[5330]: Connection closed by 68.220.241.50 port 53626 Jan 16 18:01:33.052780 sshd-session[5326]: pam_unix(sshd:session): session closed for user core Jan 16 18:01:33.053000 audit[5326]: USER_END pid=5326 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:33.053000 audit[5326]: CRED_DISP pid=5326 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:33.062829 kernel: audit: type=1106 audit(1768586493.053:866): pid=5326 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:session_close grantors=pam_loginuid,pam_env,pam_namespace,pam_keyinit,pam_limits,pam_env,pam_umask,pam_unix,pam_systemd,pam_lastlog,pam_mail acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:33.062907 kernel: audit: type=1104 audit(1768586493.053:867): pid=5326 uid=0 auid=500 ses=22 subj=system_u:system_r:kernel_t:s0 msg='op=PAM:setcred grantors=pam_env,pam_faillock acct="core" exe="/usr/lib64/misc/sshd-session" hostname=68.220.241.50 addr=68.220.241.50 terminal=ssh res=success' Jan 16 18:01:33.061782 systemd[1]: sshd@22-49.12.189.56:22-68.220.241.50:53626.service: Deactivated successfully. Jan 16 18:01:33.062000 audit[1]: SERVICE_STOP pid=1 uid=0 auid=4294967295 ses=4294967295 subj=system_u:system_r:kernel_t:s0 msg='unit=sshd@22-49.12.189.56:22-68.220.241.50:53626 comm="systemd" exe="/usr/lib/systemd/systemd" hostname=? addr=? terminal=? res=success' Jan 16 18:01:33.067862 systemd[1]: session-22.scope: Deactivated successfully. Jan 16 18:01:33.071047 systemd-logind[1532]: Session 22 logged out. Waiting for processes to exit. Jan 16 18:01:33.073303 systemd-logind[1532]: Removed session 22. 
Jan 16 18:01:33.751609 kubelet[2825]: E0116 18:01:33.749273 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:01:37.748574 kubelet[2825]: E0116 18:01:37.748483 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433" Jan 16 18:01:39.749076 kubelet[2825]: E0116 18:01:39.748996 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9" Jan 16 18:01:42.748514 kubelet[2825]: E0116 18:01:42.748455 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-k622k" podUID="e2b5c1b0-147f-4d63-9d4a-0d66468ad158" Jan 16 18:01:42.749750 kubelet[2825]: E0116 18:01:42.749651 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-kube-controllers\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/kube-controllers:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/kube-controllers:v3.30.4: not 
found\"" pod="calico-system/calico-kube-controllers-94bfc95c-f6gcl" podUID="c9307e86-6ef4-4f5e-8c8d-f9f21c28f28a" Jan 16 18:01:44.747346 kubelet[2825]: E0116 18:01:44.747223 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"calico-apiserver\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/apiserver:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/apiserver:v3.30.4: not found\"" pod="calico-apiserver/calico-apiserver-767f75bcf4-xk99t" podUID="d584925c-30a0-4760-b407-e5a8a40d9f3d" Jan 16 18:01:46.747856 kubelet[2825]: E0116 18:01:46.747679 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"calico-csi\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/csi:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/csi:v3.30.4: not found\", failed to \"StartContainer\" for \"csi-node-driver-registrar\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/node-driver-registrar:v3.30.4: not found\"]" pod="calico-system/csi-node-driver-lqf7c" podUID="8f14bccb-f353-467f-b549-674ae9114a0e" Jan 16 18:01:48.032055 systemd[1]: cri-containerd-f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f.scope: Deactivated successfully. Jan 16 18:01:48.032623 systemd[1]: cri-containerd-f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f.scope: Consumed 5.518s CPU time, 63.9M memory peak, 2.3M read from disk. Jan 16 18:01:48.038360 kernel: kauditd_printk_skb: 1 callbacks suppressed Jan 16 18:01:48.038488 kernel: audit: type=1334 audit(1768586508.035:869): prog-id=101 op=UNLOAD Jan 16 18:01:48.038684 kernel: audit: type=1334 audit(1768586508.035:870): prog-id=105 op=UNLOAD Jan 16 18:01:48.035000 audit: BPF prog-id=101 op=UNLOAD Jan 16 18:01:48.035000 audit: BPF prog-id=105 op=UNLOAD Jan 16 18:01:48.037000 audit: BPF prog-id=254 op=LOAD Jan 16 18:01:48.039866 kernel: audit: type=1334 audit(1768586508.037:871): prog-id=254 op=LOAD Jan 16 18:01:48.037000 audit: BPF prog-id=81 op=UNLOAD Jan 16 18:01:48.040658 kernel: audit: type=1334 audit(1768586508.037:872): prog-id=81 op=UNLOAD Jan 16 18:01:48.040969 containerd[1547]: time="2026-01-16T18:01:48.040903167Z" level=info msg="received container exit event container_id:\"f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f\" id:\"f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f\" pid:2668 exit_status:1 exited_at:{seconds:1768586508 nanos:40403129}" Jan 16 18:01:48.070994 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f-rootfs.mount: Deactivated successfully. 
Jan 16 18:01:48.318418 kubelet[2825]: E0116 18:01:48.318279 2825 controller.go:195] "Failed to update lease" err="rpc error: code = Unavailable desc = error reading from server: read tcp 10.0.0.3:52014->10.0.0.2:2379: read: connection timed out" Jan 16 18:01:48.580784 kubelet[2825]: I0116 18:01:48.580499 2825 scope.go:117] "RemoveContainer" containerID="f0fdedc89d7e27154ad2d04835c01ba181c0c01a75589e5f18ecf1f7f26e605f" Jan 16 18:01:48.583590 containerd[1547]: time="2026-01-16T18:01:48.583524569Z" level=info msg="CreateContainer within sandbox \"94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11\" for container &ContainerMetadata{Name:kube-controller-manager,Attempt:1,}" Jan 16 18:01:48.595573 containerd[1547]: time="2026-01-16T18:01:48.594391800Z" level=info msg="Container 9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23: CDI devices from CRI Config.CDIDevices: []" Jan 16 18:01:48.600864 systemd[1]: var-lib-containerd-tmpmounts-containerd\x2dmount677095229.mount: Deactivated successfully. Jan 16 18:01:48.603982 containerd[1547]: time="2026-01-16T18:01:48.603855477Z" level=info msg="CreateContainer within sandbox \"94d4090256e102b4a5214e250eff5e505fc432bb6d2834071eb4209a295f4a11\" for &ContainerMetadata{Name:kube-controller-manager,Attempt:1,} returns container id \"9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23\"" Jan 16 18:01:48.604614 containerd[1547]: time="2026-01-16T18:01:48.604574673Z" level=info msg="StartContainer for \"9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23\"" Jan 16 18:01:48.605929 containerd[1547]: time="2026-01-16T18:01:48.605893547Z" level=info msg="connecting to shim 9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23" address="unix:///run/containerd/s/93c92de28a9bcadde16c5fafc224e5e545fe0280eecbb9dd8326357dd6dcddd6" protocol=ttrpc version=3 Jan 16 18:01:48.630036 systemd[1]: Started cri-containerd-9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23.scope - libcontainer container 9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23. 
Jan 16 18:01:48.643000 audit: BPF prog-id=255 op=LOAD
Jan 16 18:01:48.647710 kernel: audit: type=1334 audit(1768586508.643:873): prog-id=255 op=LOAD
Jan 16 18:01:48.647848 kernel: audit: type=1334 audit(1768586508.645:874): prog-id=256 op=LOAD
Jan 16 18:01:48.645000 audit: BPF prog-id=256 op=LOAD
Jan 16 18:01:48.651529 kernel: audit: type=1300 audit(1768586508.645:874): arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.651683 kernel: audit: type=1327 audit(1768586508.645:874): proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.645000 audit[5366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178180 a2=98 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.645000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.645000 audit: BPF prog-id=256 op=UNLOAD
Jan 16 18:01:48.645000 audit[5366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.656926 kernel: audit: type=1334 audit(1768586508.645:875): prog-id=256 op=UNLOAD
Jan 16 18:01:48.657029 kernel: audit: type=1300 audit(1768586508.645:875): arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.645000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.648000 audit: BPF prog-id=257 op=LOAD
Jan 16 18:01:48.648000 audit[5366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=40001783e8 a2=98 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.648000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.653000 audit: BPF prog-id=258 op=LOAD
Jan 16 18:01:48.653000 audit[5366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=4000178168 a2=98 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.653000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.653000 audit: BPF prog-id=258 op=UNLOAD
Jan 16 18:01:48.653000 audit[5366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.653000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.653000 audit: BPF prog-id=257 op=UNLOAD
Jan 16 18:01:48.653000 audit[5366]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.653000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.653000 audit: BPF prog-id=259 op=LOAD
Jan 16 18:01:48.653000 audit[5366]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=4000178648 a2=98 a3=0 items=0 ppid=2502 pid=5366 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:48.653000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3963356361633463323535323165623232626463353034353565653232
Jan 16 18:01:48.694834 containerd[1547]: time="2026-01-16T18:01:48.694776262Z" level=info msg="StartContainer for \"9c5cac4c25521eb22bdc50455ee22481d1a2f53bbed9e93671e7db66d8ad8e23\" returns successfully"
Jan 16 18:01:48.998417 systemd[1]: cri-containerd-96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208.scope: Deactivated successfully.
Jan 16 18:01:48.999019 systemd[1]: cri-containerd-96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208.scope: Consumed 38.259s CPU time, 122.6M memory peak.
Jan 16 18:01:49.000802 containerd[1547]: time="2026-01-16T18:01:49.000068308Z" level=info msg="received container exit event container_id:\"96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208\" id:\"96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208\" pid:3149 exit_status:1 exited_at:{seconds:1768586508 nanos:999463951}"
Jan 16 18:01:49.001000 audit: BPF prog-id=144 op=UNLOAD
Jan 16 18:01:49.001000 audit: BPF prog-id=148 op=UNLOAD
Jan 16 18:01:49.032722 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208-rootfs.mount: Deactivated successfully.
Jan 16 18:01:49.582031 kubelet[2825]: I0116 18:01:49.581997 2825 scope.go:117] "RemoveContainer" containerID="96022609c912762dfff72d226954a7f0c55cf4867f4800d07139f6954c4a6208"
Jan 16 18:01:49.584136 containerd[1547]: time="2026-01-16T18:01:49.584083641Z" level=info msg="CreateContainer within sandbox \"83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555\" for container &ContainerMetadata{Name:tigera-operator,Attempt:1,}"
Jan 16 18:01:49.592652 containerd[1547]: time="2026-01-16T18:01:49.592609126Z" level=info msg="Container 2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3: CDI devices from CRI Config.CDIDevices: []"
Jan 16 18:01:49.600450 containerd[1547]: time="2026-01-16T18:01:49.600381094Z" level=info msg="CreateContainer within sandbox \"83e73ffb7c35851edf6292115e6e9c80652f701dbafebc043ff97d40fca5f555\" for &ContainerMetadata{Name:tigera-operator,Attempt:1,} returns container id \"2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3\""
Jan 16 18:01:49.601203 containerd[1547]: time="2026-01-16T18:01:49.601149131Z" level=info msg="StartContainer for \"2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3\""
Jan 16 18:01:49.604009 containerd[1547]: time="2026-01-16T18:01:49.603615201Z" level=info msg="connecting to shim 2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3" address="unix:///run/containerd/s/f7dc221116a2434887973625e1d1eeb87dd55bda2462bf0c6dc3bbb8d5af58cf" protocol=ttrpc version=3
Jan 16 18:01:49.629802 systemd[1]: Started cri-containerd-2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3.scope - libcontainer container 2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3.
Jan 16 18:01:49.649000 audit: BPF prog-id=260 op=LOAD
Jan 16 18:01:49.649000 audit: BPF prog-id=261 op=LOAD
Jan 16 18:01:49.649000 audit[5406]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c180 a2=98 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.649000 audit: BPF prog-id=261 op=UNLOAD
Jan 16 18:01:49.649000 audit[5406]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.649000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.650000 audit: BPF prog-id=262 op=LOAD
Jan 16 18:01:49.650000 audit[5406]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c3e8 a2=98 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.650000 audit: BPF prog-id=263 op=LOAD
Jan 16 18:01:49.650000 audit[5406]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=23 a0=5 a1=400010c168 a2=98 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.650000 audit: BPF prog-id=263 op=UNLOAD
Jan 16 18:01:49.650000 audit[5406]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=17 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.650000 audit: BPF prog-id=262 op=UNLOAD
Jan 16 18:01:49.650000 audit[5406]: SYSCALL arch=c00000b7 syscall=57 success=yes exit=0 a0=15 a1=0 a2=0 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.650000 audit: BPF prog-id=264 op=LOAD
Jan 16 18:01:49.650000 audit[5406]: SYSCALL arch=c00000b7 syscall=280 success=yes exit=21 a0=5 a1=400010c648 a2=98 a3=0 items=0 ppid=2883 pid=5406 auid=4294967295 uid=0 gid=0 euid=0 suid=0 fsuid=0 egid=0 sgid=0 fsgid=0 tty=(none) ses=4294967295 comm="runc" exe="/usr/bin/runc" subj=system_u:system_r:kernel_t:s0 key=(null)
Jan 16 18:01:49.650000 audit: PROCTITLE proctitle=72756E63002D2D726F6F74002F72756E2F636F6E7461696E6572642F72756E632F6B38732E696F002D2D6C6F67002F72756E2F636F6E7461696E6572642F696F2E636F6E7461696E6572642E72756E74696D652E76322E7461736B2F6B38732E696F2F3232303561343334663861326562613139396262386666656237366262
Jan 16 18:01:49.672571 containerd[1547]: time="2026-01-16T18:01:49.671523283Z" level=info msg="StartContainer for \"2205a434f8a2eba199bb8ffeb76bb0f371e305e95d6b154f94573beefcc54ce3\" returns successfully"
Jan 16 18:01:51.748534 containerd[1547]: time="2026-01-16T18:01:51.748227167Z" level=info msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\""
Jan 16 18:01:52.341938 containerd[1547]: time="2026-01-16T18:01:52.341865012Z" level=info msg="fetch failed after status: 404 Not Found" host=ghcr.io
Jan 16 18:01:52.344420 containerd[1547]: time="2026-01-16T18:01:52.344353125Z" level=error msg="PullImage \"ghcr.io/flatcar/calico/whisker:v3.30.4\" failed" error="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found"
Jan 16 18:01:52.344563 containerd[1547]: time="2026-01-16T18:01:52.344483965Z" level=info msg="stop pulling image ghcr.io/flatcar/calico/whisker:v3.30.4: active requests=0, bytes read=0"
Jan 16 18:01:52.344835 kubelet[2825]: E0116 18:01:52.344792 2825 log.go:32] "PullImage from image service failed" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 16 18:01:52.345487 kubelet[2825]: E0116 18:01:52.345200 2825 kuberuntime_image.go:43] "Failed to pull image" err="rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" image="ghcr.io/flatcar/calico/whisker:v3.30.4"
Jan 16 18:01:52.345487 kubelet[2825]: E0116 18:01:52.345315 2825 kuberuntime_manager.go:1449] "Unhandled Error" err="container whisker start failed in pod whisker-67547c9f8f-2pf9b_calico-system(78e68302-7ac2-49f4-8488-479ee8eaf4c9): ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \"ghcr.io/flatcar/calico/whisker:v3.30.4\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found" logger="UnhandledError"
Jan 16 18:01:52.346330 kubelet[2825]: E0116 18:01:52.346255 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="[failed to \"StartContainer\" for \"whisker\" with ErrImagePull: \"rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker:v3.30.4: not found\", failed to \"StartContainer\" for \"whisker-backend\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/whisker-backend:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/whisker-backend:v3.30.4: not found\"]" pod="calico-system/whisker-67547c9f8f-2pf9b" podUID="78e68302-7ac2-49f4-8488-479ee8eaf4c9"
Jan 16 18:01:52.747156 kubelet[2825]: E0116 18:01:52.747103 2825 pod_workers.go:1324] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"goldmane\" with ImagePullBackOff: \"Back-off pulling image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": ErrImagePull: rpc error: code = NotFound desc = failed to pull and unpack image \\\"ghcr.io/flatcar/calico/goldmane:v3.30.4\\\": failed to resolve image: ghcr.io/flatcar/calico/goldmane:v3.30.4: not found\"" pod="calico-system/goldmane-7c778bb748-pqdht" podUID="2eb1812e-29cb-4b14-8060-fe2f9a701433"
Jan 16 18:01:53.070747 systemd[1]: cri-containerd-a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b.scope: Deactivated successfully.
Jan 16 18:01:53.072615 kernel: kauditd_printk_skb: 40 callbacks suppressed
Jan 16 18:01:53.072674 kernel: audit: type=1334 audit(1768586513.070:891): prog-id=265 op=LOAD
Jan 16 18:01:53.070000 audit: BPF prog-id=265 op=LOAD
Jan 16 18:01:53.071087 systemd[1]: cri-containerd-a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b.scope: Consumed 3.484s CPU time, 25M memory peak, 2.6M read from disk.
Jan 16 18:01:53.070000 audit: BPF prog-id=91 op=UNLOAD
Jan 16 18:01:53.074693 kernel: audit: type=1334 audit(1768586513.070:892): prog-id=91 op=UNLOAD
Jan 16 18:01:53.073000 audit: BPF prog-id=106 op=UNLOAD
Jan 16 18:01:53.073000 audit: BPF prog-id=110 op=UNLOAD
Jan 16 18:01:53.077521 kernel: audit: type=1334 audit(1768586513.073:893): prog-id=106 op=UNLOAD
Jan 16 18:01:53.077758 kernel: audit: type=1334 audit(1768586513.073:894): prog-id=110 op=UNLOAD
Jan 16 18:01:53.078396 containerd[1547]: time="2026-01-16T18:01:53.078297502Z" level=info msg="received container exit event container_id:\"a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b\" id:\"a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b\" pid:2683 exit_status:1 exited_at:{seconds:1768586513 nanos:74864790}"
Jan 16 18:01:53.106156 systemd[1]: run-containerd-io.containerd.runtime.v2.task-k8s.io-a379306ca43099d210df2482973aa85a8fb4d26d8b549e90f591e16eb865c80b-rootfs.mount: Deactivated successfully.